As the title says, the validation set does affect the weights. But it might not be in the way you think.
The training set affects the weights per sample through backpropagation, following these steps (a minimal sketch follows the list):
- Forward a training sample input through the network (inference).
- Get the predicted output.
- Compare the prediction with the training sample's ground truth to compute the loss.
- Backpropagate the loss to update the weights.
- Repeat from step one for the next training sample.
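For concreteness, here is a minimal sketch of that per-sample loop in PyTorch. The model, loss function, and optimizer here are placeholders, not taken from any particular codebase:

```python
import torch
import torch.nn as nn

# Hypothetical setup: a tiny model, a loss, and an optimizer.
model = nn.Linear(10, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

def train_one_epoch(train_loader):
    model.train()
    for inputs, targets in train_loader:
        optimizer.zero_grad()
        outputs = model(inputs)           # forward the sample, get the prediction
        loss = loss_fn(outputs, targets)  # compare with the ground truth -> loss
        loss.backward()                   # backpropagate the loss
        optimizer.step()                  # update the weights
    # The for-loop itself repeats the process for the next sample/batch.
```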
The validation set, on the other hand, affects the weights from a higher-level, per-epoch perspective, with no gradient descent involved:
- Save the weights of the initial epoch.
- If the validation performance of the current epoch is worse than that of the previous epoch, restore the previously saved weights.
- Otherwise, save the current weights.
- Repeat for the next epoch.
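One way to read those steps as code, again as a sketch: `train_one_epoch` and `evaluate` are hypothetical callables (one epoch of gradient training and a validation score where higher is better), and the model is assumed to be a PyTorch module so `state_dict` / `load_state_dict` are available.

```python
import copy

def fit(model, train_one_epoch, evaluate, num_epochs):
    prev_score = float("-inf")
    prev_weights = copy.deepcopy(model.state_dict())  # save the initial weights

    for epoch in range(num_epochs):
        train_one_epoch(model)      # gradient-based updates from the training set
        score = evaluate(model)     # validation performance for this epoch

        if score < prev_score:
            # Validation got worse: roll back to the previously saved weights.
            model.load_state_dict(prev_weights)
        else:
            # Validation held or improved: save the current weights.
            prev_score = score
            prev_weights = copy.deepcopy(model.state_dict())
```

Note that the validation set never produces a gradient here; it only decides which set of weights survives from one epoch to the next.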
So the validation set affects the weights across epochs, not across batches or samples. Given that, is it still considered a validation set?