Questions tagged [epochs]

In terms of artificial neural networks, an epoch refers to one full cycle through the training dataset. Usually, training a neural network takes more than a few epochs. ... With a neural network, the goal of the model is generally to classify inputs or generate outputs that can be judged correct or incorrect.
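The "full cycle" in this definition can be sketched as a plain training loop (the `model_step` callback is a hypothetical stand-in for a forward/backward pass, not code from any of the posts below):

```python
import random

def train(model_step, data, epochs=3, batch_size=32):
    """Run `epochs` full passes over `data`; one epoch = every sample seen once."""
    for epoch in range(epochs):
        random.shuffle(data)              # reshuffle so batches differ each epoch
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            model_step(batch)             # one gradient update per mini-batch
```

With 100 samples and a batch size of 32, one epoch is 4 update steps (the last batch holds the remaining 4 samples); the epoch count only controls how many times that inner loop repeats.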

9 questions
6 votes · 2 answers

How to shorten the development time of a neural network?

I am developing an LSTM for sequence tagging. During development, I make various changes to the system, for example, adding new features, changing the number of nodes in the hidden layers, etc. After each change, I check the accuracy using…
4 votes · 1 answer

What is your training time of Resnet-18/Resnet-50 on Imagenet?

My training of a Resnet-18 network on Imagenet using a Tesla V100 seems to be quite slow (1 epoch is about 2.5 hours, batch size 128). Increasing the number of GPUs does not seem to help. What is your training time of Resnet-18/Resnet-50 on Imagenet? How…
cerebrou · 161
2 votes · 1 answer

What is the reason we loop over epochs when training a neural network?

After reading through this thread and some other resources online, I still do not understand the role of epochs in training a neural network. I understand that one epoch is one iteration through the entire data set. But I don't understand what…
2 votes · 1 answer

When is the loss calculated, and when does the back-propagation take place?

I've read several articles and keep getting confused on this point. I'm not sure whether the literature gives mixed information or I'm misinterpreting it. From my reading, my (loose) understanding of the following terms is as…
1 vote · 2 answers

Is there any relationship between the batch size and the number of epochs?

I am currently running a program with a batch size of 17 instead of batch size 32. The benchmark results were obtained at a batch size of 32 with 700 epochs. Now I am running with batch size 17 and an unchanged number of epochs. So I am…
hanugm · 4,102
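The relationship this question asks about reduces to a bit of arithmetic: an epoch always covers the whole dataset, so changing the batch size changes only how many gradient updates one epoch takes, not how many times each sample is seen. A minimal sketch (the dataset size of 3,200 is an illustrative assumption, not taken from the post):

```python
import math

def steps_per_epoch(num_samples: int, batch_size: int) -> int:
    """Gradient updates in one full pass; the last batch may be smaller."""
    return math.ceil(num_samples / batch_size)

# Hypothetical 3,200-sample dataset:
#   batch size 32 -> 100 steps per epoch
#   batch size 17 -> 189 steps per epoch
# Either way, 700 epochs visits every sample 700 times; only the update
# count and the gradient-noise level differ.
```

This is why benchmarks quoted at one batch size are not directly comparable at another even when the epoch count is held fixed.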
1 vote · 1 answer

Why does the accuracy drop while the loss decreases, as the number of epochs increases?

I've been trying to find the optimal number of epochs that I should train my neural network (that I just implemented) for. The visualizations below show the neural network being run with a variable number of epochs. It is quite obvious that the…
eGood · 11
0 votes · 1 answer

Low validation loss from the first epoch?

The validation loss is low from the first epoch and then decreases only slightly. What does this actually mean? Does it indicate that the model can effectively and quickly identify patterns for this task? I can see that the model works in…
RT. · 101
0 votes · 1 answer

What does it mean if I trained my model with more steps per epoch than the total number of training images I have?

I'm having a little trouble understanding what steps per epoch really means. I've read that Number of Steps per Epoch = (Total Number of Training Samples) / (Batch Size), but I don't understand at which point in the training this…
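The situation in the title, more steps per epoch than the data supports, typically means the input pipeline loops back over the samples, so some samples are seen more than once within a single "epoch". A sketch of that behaviour (the `batches` helper is hypothetical, standing in for a data generator that cycles):

```python
import itertools

def batches(samples, batch_size, steps):
    """Yield `steps` batches from an endlessly repeating stream of samples.

    If steps > len(samples) / batch_size, the stream wraps around and some
    samples repeat within the same "epoch" -- which is what happens when
    steps_per_epoch is set higher than the dataset size allows."""
    stream = itertools.cycle(samples)
    for _ in range(steps):
        yield [next(stream) for _ in range(batch_size)]
```

For example, 10 samples with a batch size of 4 and 5 steps draws 20 items, so each sample is used exactly twice in that one "epoch".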
0 votes · 1 answer

Does an increase in the number of epochs lead to complete breakdown?

Recently, I ran code on my system that involves deep neural networks. The number of epochs provided by the designers is 301. I tried to increase the number of epochs to 501. To my shock, the model after 350 epochs behaves erratically. And I can…
hanugm · 4,102