
Is binary cross-entropy commutative, both in mathematical theory and in PyTorch?

I'm not fresh enough on my math to tell just by looking at the formula and the code.

BCE(x,y) =?= BCE(y,x)

Thomas Eding

1 Answer


Binary cross-entropy is not commutative, and the two quantities typically used with it in machine learning, the ground truth $y$ and the estimated class probability $\hat{y}$, are not conceptually interchangeable either.

The BCE loss function is:

$$\mathcal{L}(y,\hat{y}) = -y\log(\hat{y}) - (1-y)\log(1-\hat{y})$$

Ground truth $y$ is commonly either $0$ or $1$, indicating whether an individual example is a class member or not. If that value were used in place of $\hat{y}$, then one of the logs would be $\log(0)$, which is undefined.
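A quick sketch in plain Python (using the standard library's `math.log` for the natural log) shows what happens if a hard $0/1$ label ends up inside the logarithm:

```python
import math

# If a hard label y_hat = 1 were passed where the probability estimate
# belongs, the second term of the loss would need log(1 - 1) = log(0),
# which is mathematically undefined; Python's math.log raises an error.
try:
    math.log(0)
except ValueError as e:
    print(e)  # math domain error
```

(Deep-learning libraries typically sidestep this by clamping the log or working with logits, but mathematically the expression is undefined.)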

You can also generate example outputs when both $y$ and $\hat{y}$ are strictly between $0$ and $1$, and see that the loss function is not symmetric:

  • $\mathcal{L}(0.5,0.9) \approx 1.204$
  • $\mathcal{L}(0.9,0.5) \approx 0.693$
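The two values above can be reproduced with a minimal plain-Python helper (the `bce` function name is just for illustration):

```python
import math

def bce(y, y_hat):
    """Binary cross-entropy for a single example:
    L(y, y_hat) = -y*log(y_hat) - (1 - y)*log(1 - y_hat)"""
    return -y * math.log(y_hat) - (1 - y) * math.log(1 - y_hat)

print(round(bce(0.5, 0.9), 3))  # 1.204
print(round(bce(0.9, 0.5), 3))  # 0.693
```

Argument order matters in PyTorch too: `torch.nn.functional.binary_cross_entropy(input, target)` expects the prediction first and the ground truth second, so swapping them generally changes the result, and with hard $0/1$ labels it can run into the undefined-log case described above.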
Neil Slater