
Whenever I tune a neural network, I take the common approach of defining some number of layers, each with some number of neurons.

  • If it overfits, I reduce the number of layers or neurons, add dropout, or apply regularisation.

  • If it underfits, I do the opposite.

But this trial-and-error process sometimes feels ad hoc. Is there a more principled way to tune a neural network, i.e. to find the optimal number of layers, neurons, etc. in a principled and mathematically sound way, when it overfits or underfits?
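For reference, one standard step up from manual adjustment is to treat depth, width, and the regularisation strength as hyperparameters and search over them with cross-validation. Below is a minimal sketch using scikit-learn's `MLPClassifier` and `RandomizedSearchCV`; the synthetic dataset and the search space (the candidate layer sizes and the range for the L2 penalty `alpha`) are illustrative assumptions, not a prescription.

```python
# Random search over architecture and regularisation hyperparameters,
# scored by 3-fold cross-validation. The search space is an assumption
# made for illustration; adapt it to your problem.
from scipy.stats import loguniform
from sklearn.datasets import make_classification
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in dataset (replace with your own X, y).
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

param_distributions = {
    # Candidate depths and widths (assumed values).
    "hidden_layer_sizes": [(32,), (64,), (32, 32), (64, 64), (128, 64)],
    # L2 penalty sampled on a log scale.
    "alpha": loguniform(1e-5, 1e-1),
}

search = RandomizedSearchCV(
    MLPClassifier(max_iter=300, random_state=0),
    param_distributions,
    n_iter=10,       # number of random configurations to try
    cv=3,            # 3-fold cross-validation per configuration
    random_state=0,
)
search.fit(X, y)
print(search.best_params_)
```

The same pattern extends to more sophisticated strategies (Bayesian optimisation, Hyperband) by swapping out the search driver while keeping the cross-validated objective.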

nbro
Fasty