Abstract

We show that the entropy of the sum of independent random variables can differ greatly from the entropy of their difference; indeed, the gap between the two entropies can be arbitrarily large. This holds for regular (discrete) entropies as well as for differential entropies. Our results rely heavily on a result of Ruzsa, who studied the cardinalities of sums and differences of finite sets.
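In symbols, writing $H$ for the discrete entropy and $h$ for the differential entropy (notation assumed here, not fixed above), the main claim reads: for every $M > 0$ there exist independent random variables $X$ and $Y$ such that
\[
  \bigl| H(X+Y) - H(X-Y) \bigr| \;>\; M,
\]
and likewise with $h$ in place of $H$.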