Epoch
Definition
One complete pass of the training dataset through the algorithm.
Deep Dive
In the context of training machine learning models, particularly neural networks, an "epoch" represents one complete pass of the entire training dataset through the learning algorithm. During a single epoch, every example in the training set is presented to the model at least once, allowing the model to make predictions, compute the error (loss), and update its internal parameters (weights and biases) based on that error. Because the training data is usually processed in smaller subsets called "batches" for computational efficiency, a full epoch consists of multiple iterations, where each iteration processes one batch. For example, a dataset of 1,000 examples with a batch size of 100 requires 10 iterations per epoch.
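The epoch/batch/iteration relationship above can be sketched in a minimal training-loop skeleton. The dataset size, batch size, and epoch count here are illustrative assumptions, not values from a real workload:

```python
# Illustrative numbers (assumptions, not from any real dataset):
dataset_size = 1000   # total training examples
batch_size = 100      # examples processed per iteration
epochs = 50           # complete passes over the dataset

# Iterations per epoch and total parameter updates follow directly:
batches_per_epoch = dataset_size // batch_size   # 10 iterations per epoch
total_iterations = epochs * batches_per_epoch    # 500 updates over training

for epoch in range(epochs):
    for batch_start in range(0, dataset_size, batch_size):
        # One batch: a contiguous slice of example indices.
        batch = range(batch_start, batch_start + batch_size)
        # In a real framework, the forward pass, loss computation,
        # and parameter update would happen here.
```

After all epochs complete, every training example has been seen exactly `epochs` times, and the model's parameters have been updated `total_iterations` times.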
Examples & Use Cases
- Training a neural network for image classification for 50 epochs, meaning the entire dataset is passed through the network 50 times
- Optimizing a deep reinforcement learning agent over 100 epochs, allowing it to explore and learn from the environment repeatedly
- Running a natural language processing model for 10 epochs to fine-tune its parameters on a specific text corpus