
If the entropy of information reaches its maximum when all values of X are equally probable (i.e., maximum uncertainty), do probabilities that deviate only slightly from uniform still give high entropy?

MAIKAO

1 Answer


If you have a finite set of possibilities (such as 0/1 for a binary variable), the entropy is maximized when each outcome is equally probable (0.5 each in the binary case); nothing can increase it further.

If your distribution is close to uniform, the entropy is high, but it is not maximized.
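A quick numerical sketch of this for the binary case, using the standard definition H = -Σ p·log₂(p):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over two outcomes: entropy hits its maximum of 1 bit.
print(entropy([0.5, 0.5]))    # 1.0

# Slightly non-uniform: entropy is still high, but strictly below the maximum.
print(entropy([0.55, 0.45]))  # ~0.9928
```

So a near-uniform distribution contributes nearly maximal entropy, but only the exactly uniform one attains the maximum.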

Alberto