Is binary cross entropy commutative? Both in math theory and in PyTorch?
I'm not fresh enough on my math to tell just by looking at the formula and code.
BCE(x,y) =?= BCE(y,x)
Binary cross-entropy is not commutative, and the two arguments typically passed to it in machine learning, the ground truth $y$ and the estimated class probability $\hat{y}$, are not conceptually interchangeable either.
The BCE loss function is:
$$\mathcal{L}(y,\hat{y}) = -y\log(\hat{y}) - (1-y)\log(1-\hat{y})$$
Ground truth $y$ is commonly either $0$ or $1$, indicating whether an individual example is a member of the class or not. If that value were used in place of $\hat{y}$, then one of the log terms would be $\log(0)$, which is undefined.
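For instance, with $y = 1$ and $\hat{y} = 0.9$ (values chosen arbitrarily), the loss is finite in one argument order but not the other:

$$\mathcal{L}(1, 0.9) = -1 \cdot \log(0.9) - 0 \cdot \log(0.1) \approx 0.105$$

$$\mathcal{L}(0.9, 1) = -0.9\log(1) - 0.1\log(0) \quad \text{(undefined, since } \log(0) \text{ diverges)}$$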
You can evaluate the loss for example values where both $y$ and $\hat{y}$ are strictly between $0$ and $1$, and see that the loss function is not symmetric:
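For example, a quick check with `torch.nn.functional.binary_cross_entropy` (a minimal sketch; the probability values below are arbitrary, chosen strictly between $0$ and $1$ so that both argument orders are defined):

```python
import torch
import torch.nn.functional as F

# Arbitrary probabilities, strictly between 0 and 1
y_hat = torch.tensor([0.2, 0.7, 0.9])  # predicted probabilities
y = torch.tensor([0.4, 0.5, 0.8])      # targets

# binary_cross_entropy(input, target): `input` goes inside the logs,
# `target` weights them, so the two arguments play different roles.
print(F.binary_cross_entropy(y_hat, y).item())  # ~0.7009
print(F.binary_cross_entropy(y, y_hat).item())  # ~0.5490
```

The two results differ, confirming that the loss is not symmetric in its arguments.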