I've been trying to find the optimal number of epochs to train my newly implemented neural network for.

The visualization below shows the network's accuracy as a function of the number of training epochs. As expected, the accuracy generally increases with the number of epochs. However, around 75 epochs we see a dip before the accuracy continues to rise. What is the cause of this?

[Figure: accuracy vs. number of training epochs, with a dip around epoch 75]

1 Answer

A decrease in loss does not necessarily lead to an increase in accuracy (most of the time it does, but sometimes it does not). To understand why, have a look at this question. The network only cares about decreasing the loss; it does not care about the accuracy at all. So it is no surprise to see what you presented.
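Here is a minimal numeric sketch of how this can happen, using made-up binary-classification predictions (not from your network): between two hypothetical epochs, most predictions become much more confident, so the mean cross-entropy loss drops, even though one example flips to the wrong class and accuracy falls.

```python
import numpy as np

def bce_loss(y, p):
    """Mean binary cross-entropy."""
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def accuracy(y, p):
    """Fraction of predictions on the correct side of 0.5."""
    return np.mean((p > 0.5) == (y == 1))

y = np.array([1, 1, 0, 0])

# Epoch A: all four predictions are barely correct.
p_a = np.array([0.55, 0.55, 0.45, 0.45])

# Epoch B: three predictions become very confident and correct,
# but one flips to the wrong side of the decision boundary.
p_b = np.array([0.99, 0.99, 0.01, 0.55])

print(f"epoch A: loss={bce_loss(y, p_a):.3f}, acc={accuracy(y, p_a):.2f}")
print(f"epoch B: loss={bce_loss(y, p_b):.3f}, acc={accuracy(y, p_b):.2f}")
# epoch A: loss=0.598, acc=1.00
# epoch B: loss=0.207, acc=0.75   <- lower loss, lower accuracy
```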

Additional note: if you train your network in mini-batches, or if you choose a large step size, you may also see the loss itself increase at times. In the mini-batch case, the hope is that the overall trend of the loss is still downward.
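To illustrate that note, here is a small sketch of plain mini-batch SGD on a toy one-parameter linear model (all data and hyperparameters are made up for demonstration): individual per-batch losses are noisy and occasionally go up, but the trend over training is down.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 3x + noise.
X = rng.uniform(-1, 1, size=(256, 1))
y = 3 * X[:, 0] + rng.normal(scale=0.3, size=256)

w = 0.0            # single weight, no bias
lr = 0.5           # fairly large step size
batch_size = 16

losses = []
for epoch in range(5):
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        b = idx[start:start + batch_size]
        pred = w * X[b, 0]
        # Gradient of the mini-batch MSE with respect to w.
        grad = 2 * np.mean((pred - y[b]) * X[b, 0])
        w -= lr * grad
        losses.append(np.mean((pred - y[b]) ** 2))

# The per-batch losses bounce around (sometimes increasing),
# while the overall trend is downward.
print("first 5 batch losses:", np.round(losses[:5], 3))
print("last 5 batch losses: ", np.round(losses[-5:], 3))
```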
