It's a well-known theorem of information theory that, among all additive noise channels with a given noise variance (i.e., a fixed noise power), the Gaussian channel has the minimum capacity. I learned this long ago, and over time the concept was distilled in my memory down to "Gaussian is the worst noise".
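Here's a minimal numerical sketch of what that theorem says (the parameter choices and function names are mine, not from any particular source): feed a Gaussian input of power P through uniform noise with the same variance N, and the resulting mutual information already exceeds the AWGN capacity (1/2)log2(1+P/N). Since capacity can only be higher than what one particular input achieves, the uniform-noise channel has strictly larger capacity than the Gaussian-noise channel.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

P, N = 1.0, 1.0                 # input power and noise variance (arbitrary choice)
a = np.sqrt(3 * N)              # Unif[-a, a] has variance a^2/3 = N

def p_y(y):
    # Density of Y = X + Z for X ~ N(0, P), Z ~ Unif[-a, a]:
    # the Gaussian density averaged over the uniform interval.
    return (norm.cdf((y + a) / np.sqrt(P)) - norm.cdf((y - a) / np.sqrt(P))) / (2 * a)

def integrand(y):
    p = p_y(y)
    return -p * np.log2(p) if p > 0 else 0.0

# Differential entropies in bits: h(Y) by numerical integration, h(Z) in closed form.
h_Y, _ = quad(integrand, -(a + 10 * np.sqrt(P)), a + 10 * np.sqrt(P), limit=200)
h_Z = np.log2(2 * a)

I_uniform = h_Y - h_Z                    # I(X; X+Z) for a (suboptimal) Gaussian input
C_gaussian = 0.5 * np.log2(1 + P / N)    # capacity with Gaussian noise of the same power

print(f"Gaussian input, uniform noise: I = {I_uniform:.4f} bits")
print(f"AWGN capacity, same P and N:   C = {C_gaussian:.4f} bits")
```

At P = N = 1 the Gaussian-noise capacity is 0.5 bits, while the uniform-noise channel gives noticeably more even without optimizing the input.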
I was trying to use this "property" to get a quick lower bound on the capacity of a new binary-input channel. But after some puzzling results, my grad student dug up an old paper by Shamai and VerdĂș which -- much to my surprise and dismay -- shows that Gaussian noise is not worst-case when the input distribution is constrained! In particular, if the input is binary {+1,-1}, the worst power-constrained noise distribution is discrete, supported on {...,-2,0,+2,...}, and generally can't be expressed in closed form.
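You can see the phenomenon numerically with a toy example (again, the setup is my own: a simple three-point noise on {-2, 0, +2} as a stand-in for the paper's optimal distribution, which it is not). For equiprobable +/-1 input and matched noise power, the discrete noise already yields slightly lower mutual information than Gaussian noise:

```python
import numpy as np
from scipy.integrate import quad

def H(probs):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * np.log2(p) for p in probs if p > 0)

def mi_binary_gaussian(sigma2):
    """I(X; X+Z) in bits for equiprobable X in {-1,+1}, Z ~ N(0, sigma2),
    computed as h(Y) - h(Z) with h(Y) integrated numerically."""
    def p_y(y):
        g = lambda m: np.exp(-(y - m) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
        return 0.5 * (g(-1.0) + g(1.0))
    def integrand(y):
        p = p_y(y)
        return -p * np.log2(p) if p > 0 else 0.0
    h_Y, _ = quad(integrand, -1 - 10 * np.sqrt(sigma2), 1 + 10 * np.sqrt(sigma2), limit=200)
    h_Z = 0.5 * np.log2(2 * np.pi * np.e * sigma2)
    return h_Y - h_Z

def mi_binary_three_point(p):
    """I(X; X+Z) in bits for equiprobable X in {-1,+1} and Z on {-2, 0, +2}
    with probabilities (p, 1-2p, p), so Var(Z) = 8p.
    Y lands on {-3,-1,+1,+3} with probabilities (p/2, (1-p)/2, (1-p)/2, p/2)."""
    return H([p / 2, (1 - p) / 2, (1 - p) / 2, p / 2]) - H([p, 1 - 2 * p, p])

sigma2 = 1.0
# The power constraint is an inequality, so search over Var(Z) = 8p <= sigma2.
best = min(mi_binary_three_point(p) for p in np.linspace(1e-4, sigma2 / 8, 500))

print(f"Gaussian noise:        I = {mi_binary_gaussian(sigma2):.4f} bits")
print(f"3-point noise on 2Z:   I = {best:.4f} bits")
```

At sigma^2 = 1 the gap is small (roughly 0.486 vs. 0.482 bits) but it's on the wrong side of the inequality I was counting on, and the true worst-case noise from the paper would push it lower still.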
For more details, see:
S. Shamai and S. VerdĂș, "Worst-case power-constrained noise for binary-input channels," IEEE Trans. Inf. Theory, vol. 38, no. 5, pp. 1494--1511, Sep. 1992.