For questions related to the Jensen–Shannon divergence, a measure of the similarity between two probability distributions. The JS divergence is a symmetrized and smoothed version of the Kullback–Leibler divergence.
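For reference, the standard definition in terms of the KL divergence, where $M$ is the equal-weight mixture of the two distributions:

$$\mathrm{JSD}(P \parallel Q) = \tfrac{1}{2}\, D_{\mathrm{KL}}(P \parallel M) + \tfrac{1}{2}\, D_{\mathrm{KL}}(Q \parallel M), \qquad M = \tfrac{1}{2}(P + Q)$$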
Questions tagged [jensen-shannon-divergence]
1 question
5 votes · 1 answer
Why is the Jensen-Shannon divergence preferred over the KL divergence in measuring the performance of a generative network?
I have read articles on how the Jensen-Shannon divergence is preferred over the Kullback-Leibler divergence for measuring how well a distribution mapping is learned by a generative network, because the JS divergence better measures distribution similarity…
asked by ashenoy
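The usual argument behind the preference asked about here is that the KL divergence blows up to infinity when the two distributions have disjoint supports, while the JS divergence stays bounded by $\ln 2$. A minimal sketch for discrete distributions, assuming NumPy and SciPy are available (the `js_divergence` helper is illustrative, not a library function; `scipy.stats.entropy(p, q)` computes $D_{\mathrm{KL}}(p \parallel q)$):

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) computes KL(p || q)

def js_divergence(p, q):
    """Jensen-Shannon divergence of two discrete distributions (natural log)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # equal-weight mixture distribution M
    return 0.5 * entropy(p, m) + 0.5 * entropy(q, m)

# Distributions with disjoint supports, as in the early-training GAN setting:
p = np.array([1.0, 0.0])
q = np.array([0.0, 1.0])

print(entropy(p, q))        # inf      -- KL(p || q) diverges
print(js_divergence(p, q))  # 0.693... -- ln 2, the maximum JS divergence
```

Because the JS divergence remains finite and well-defined even when the supports do not overlap, it gives a more usable training signal than KL in that regime.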