Here's a simple problem that came up yesterday when I was looking for a quick bound on entropy ... I thought it would be a nice warmup to kick off the term:

Let A and B be independent random variables. WLOG, suppose H(A) <= H(B).

Show that H(A) <= H(B) <= H(A+B) <= H(A)+H(B).

Post your answer in the comments.

## 4 comments:

H(A+B, A) = H(A) + H(A+B|A)

= H(A) + H(B|A) [given A, the only remaining uncertainty in A+B is the uncertainty in B]

= H(A) + H(B) [since A and B are independent]

Since H(A+B) <= H(A+B, A), we get H(A+B) <= H(A) + H(B) ---(1)

Also,

H(A+B, A) = H(A+B) + H(A|A+B).

=>

H(A+B) + H(A|A+B) = H(A) + H(B)

Since H(A|A+B) <= H(A) (conditioning cannot increase entropy),

H(A+B) + H(A) >= H(A) + H(B)

H(A+B) >= H(B) ----(2).

Although I am able to "work through" the expressions, I don't get the intuition for H(A+B) >= H(B). I just tried a couple of inequalities at random and one of them clicked... how do we go about picking the approach?

On thinking some more, I am now not sure if the assertion is correct. Is

H(A+B) >= max(H(A),H(B)) always?

Let A and B have the following joint dist:

Pr(A=1, B=-1) = 1/2

Pr(A=-1, B=1) = 1/2.

Then, A+B is always 0, so H(A+B)=0.

However, H(A) and H(B) are both 1.
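The counterexample is easy to verify numerically. A small sketch that builds the marginals and the distribution of A+B directly from the joint distribution given above:

```python
from collections import Counter
import math

def entropy(dist):
    """Shannon entropy in bits of a pmf given as {value: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Joint distribution from the comment: A and B perfectly anti-correlated.
joint = {(1, -1): 0.5, (-1, 1): 0.5}

pa, pb, psum = Counter(), Counter(), Counter()
for (a, b), p in joint.items():
    pa[a] += p
    pb[b] += p
    psum[a + b] += p

assert entropy(pa) == 1.0 and entropy(pb) == 1.0  # each marginal is a fair coin
assert entropy(psum) == 0.0                       # A+B is always 0
```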

Now, going back to the proof in my earlier comment, it seems like the "hand-waving" step was assuming:

H(A+B|A) = H(B). In general H(A+B|A) = H(B|A) <= H(B), and the equality only holds when A and B are independent.

Ha, I just saw your problem statement: A and B are independent!

You got it. As far as intuition is concerned, I prefer to prove H(A+B) >= H(B) by noting that H(A+B|A) = H(B) (which is part of your solution), and then observing that H(A+B|A) <= H(A+B) (because conditioning on side information never increases entropy).

Put less formally: if you know A+B and A, then you know B for sure. But if you know only A+B, you still face at least as much uncertainty as in either A or B alone.
