If entropy reaches its maximum when all values of X are equally probable (maximum uncertainty), does a distribution whose probabilities deviate only slightly from uniform still have greater entropy?
If you have a finite set of outcomes (e.g. 0/1 for a binary variable), entropy is maximized when each outcome has probability 0.5; nothing can increase it further.

If your distribution is close to uniform, the entropy is high, but it is not maximal.
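As a quick numerical check, here is a minimal sketch (the `entropy` helper below is my own illustration, not part of the original answer) comparing a uniform, a near-uniform, and a skewed binary distribution:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [0.5, 0.5]          # maximal uncertainty for a binary variable
near_uniform = [0.45, 0.55]   # close to uniform: high entropy, but not maximal
skewed = [0.9, 0.1]           # far from uniform: much lower entropy

print(entropy(uniform))       # 1.0 bit, the maximum for two outcomes
print(entropy(near_uniform))  # slightly below 1 bit
print(entropy(skewed))        # well below 1 bit
```

The near-uniform distribution comes out just under 1 bit, while the skewed one drops to roughly half a bit, which matches the point above: closeness to uniform means high, not maximal, entropy.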
Alberto