What is an epoch in a neural network?
I want a definition of epoch.
As I understand it, an epoch is when the weights get updated.
So how does it work?
Does it change the "training data (input data)"?
Does it change the "delta rule (activation functions)"?
This comes in the context of training a neural network with gradient descent. Since we usually train NNs using stochastic or mini-batch gradient descent, not all training data is used at each iterative step.
Stochastic and mini-batch gradient descent use a batch_size number of training examples at each iteration, so at some point you will have used all the data for training and can start over from the beginning of the dataset.
Considering that, one epoch is one complete pass through the whole training set, which means it is multiple iterations of gradient descent updates until you have shown all the data to the NN, and then you start again.
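A minimal sketch of that relationship, using a toy linear model in NumPy (the dataset, learning rate, and batch size here are made-up illustrative values, not anything prescribed above): with 1000 examples and batch_size = 32, one epoch is ceil(1000 / 32) = 32 gradient descent iterations.

```python
import numpy as np

# Hypothetical toy setup: 1000 examples, mini-batches of 32.
n_examples, batch_size = 1000, 32
X = np.random.randn(n_examples, 10)   # features
y = np.random.randn(n_examples, 1)    # targets
w = np.zeros((10, 1))                 # linear model weights
lr = 0.01

iterations_per_epoch = int(np.ceil(n_examples / batch_size))  # 32 here

for epoch in range(5):                         # 5 epochs = 5 full passes
    order = np.random.permutation(n_examples)  # reshuffle each epoch
    for i in range(iterations_per_epoch):      # one gradient step per batch
        idx = order[i * batch_size:(i + 1) * batch_size]
        Xb, yb = X[idx], y[idx]
        grad = 2 * Xb.T @ (Xb @ w - yb) / len(idx)  # MSE gradient on the batch
        w -= lr * grad                              # one weight update
    # After the inner loop, every example has been seen once: one epoch.
```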
To put it really simply:
An epoch is one cycle in which everything happens. Within one epoch, you run forward propagation and backpropagation: you make the neurons activate, calculate the loss, take the partial derivatives of the loss function, and update your weights with the new values. When all of this is done, you start a new epoch, then another, and so on. The exact number of epochs is not really important; what matters is how your loss function, derivatives, etc. change. So when you are happy with the results, you can stop iterating over epochs and take your model out :)
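Here is a minimal sketch of that loop, assuming a toy logistic-regression "network" in NumPy (the data, learning rate, and stopping tolerance are all illustrative choices): each pass is one epoch with a forward pass, loss, partial derivatives, and a weight update, and we stop once the loss stops changing, i.e. once we are "happy with the results".

```python
import numpy as np

X = np.random.randn(200, 3)                                # toy inputs
y = (X @ np.array([1.0, -2.0, 0.5]) > 0).astype(float)    # toy labels
w = np.zeros(3)
lr, prev_loss = 0.1, np.inf

for epoch in range(10_000):
    p = 1 / (1 + np.exp(-(X @ w)))                 # forward pass (sigmoid)
    loss = -np.mean(y * np.log(p + 1e-12)
                    + (1 - y) * np.log(1 - p + 1e-12))  # cross-entropy loss
    grad = X.T @ (p - y) / len(y)                  # partial derivatives
    w -= lr * grad                                 # weight update
    if abs(prev_loss - loss) < 1e-6:               # happy with results: stop
        print(f"stopped after {epoch + 1} epochs, loss={loss:.4f}")
        break
    prev_loss = loss
```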
An epoch is a single pass through the whole training dataset. Traditional (full-batch) gradient descent computes the gradient of the loss function with respect to the parameters over the entire training set, so it performs exactly one weight update per epoch, for a given number of epochs.
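As a contrast to the mini-batch sketch above, here is what traditional full-batch gradient descent looks like in NumPy (again a made-up toy linear-regression setup, purely for illustration): the gradient is computed over all examples at once, so each epoch performs exactly one update.

```python
import numpy as np

X = np.random.randn(500, 4)
true_w = np.array([2.0, -1.0, 0.5, 3.0])
y = X @ true_w + 0.1 * np.random.randn(500)   # noisy toy targets
w = np.zeros(4)
lr = 0.05

for epoch in range(200):                      # 200 epochs = 200 updates
    grad = 2 * X.T @ (X @ w - y) / len(y)     # gradient over ALL examples
    w -= lr * grad                            # the single update for this epoch

print(np.round(w, 2))  # should land near true_w
```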