
The Rényi entropy of order $\beta$ of a discrete probability distribution $p$ is given by \begin{equation} H_{\beta}(p) = \frac{1}{1 - \beta} \log \left( \sum_{i \in S} p(i)^{\beta} \right), \end{equation} where $S$ is the set of strings in the support of $p$.
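
For concreteness, here is a minimal numerical sketch of this definition (the distribution and the value of $\beta$ below are illustrative choices, not taken from the question):

```python
import math

def renyi_entropy(p, beta):
    """Rényi entropy of order beta (beta != 1) for a discrete distribution p."""
    return math.log(sum(pi ** beta for pi in p)) / (1.0 - beta)

# Example: the uniform distribution on 4 outcomes has H_beta = log 4 for every beta.
p = [0.25, 0.25, 0.25, 0.25]
print(renyi_entropy(p, beta=2.0))  # ≈ 1.386 ≈ log 4
```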

As is mentioned here, for two discrete distributions $p$ and $r$, the Rényi entropy of the product distribution $p \times r$ is

\begin{equation} H_{\beta}(p \times r) = H_{\beta}(p) + H_{\beta}(r). \end{equation}

What might be a proof of this fact?


1 Answer


This holds because the product distribution is, by definition, the joint distribution of two independent variables: $(p \times q)(i,j) = p(i)\,q(j)$ (writing $q$ for the second distribution). The double sum then factorizes into a product of the two single sums, and the logarithm turns that product into a sum: $$ \begin{aligned} H_{\beta}(p \times q) &= \frac{1}{1-\beta} \log\left( \sum_{i,j}(p(i) q(j))^{\beta} \right) \\ &= \frac{1}{1-\beta} \log\left( \left(\sum_{i}p(i)^{\beta}\right) \left(\sum_jq(j)^{\beta}\right) \right) \\ &= \frac{1}{1-\beta} \left(\log \left(\sum_{i}p(i)^{\beta}\right) + \log \left(\sum_{j}q(j)^{\beta}\right)\right) \\ &= H_{\beta}(p) + H_{\beta}(q) . \end{aligned} $$
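
As a quick sanity check of the derivation, here is a short Python sketch (the distributions and $\beta$ below are arbitrary illustrative values) confirming that the Rényi entropy of the product distribution equals the sum of the individual entropies:

```python
import math
from itertools import product

def renyi_entropy(p, beta):
    """Rényi entropy of order beta (beta != 1) for a discrete distribution p."""
    return math.log(sum(pi ** beta for pi in p)) / (1.0 - beta)

# Two arbitrary distributions (illustrative values only).
p = [0.5, 0.3, 0.2]
q = [0.6, 0.4]
beta = 3.0

# Product distribution: (p × q)(i, j) = p(i) * q(j).
pq = [pi * qj for pi, qj in product(p, q)]

print(renyi_entropy(pq, beta))                          # entropy of p × q
print(renyi_entropy(p, beta) + renyi_entropy(q, beta))  # sum of the marginals; same value
```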
