Tuesday, December 13, 2022

Epochs and batches

We provide the epochs value while fitting/training the model, as below.

Example: model.fit(X, Y, epochs=100)


In the fit statement above, the number of epochs was set to 100. This specifies that the entire data set should be applied during training 100 times. During training, you see output describing the progress of training that looks like this:

Epoch 1/100
157/157 [==============================] - 0s 1ms/step - loss: 2.2770

The first line, Epoch 1/100, describes which epoch the model is currently running. For efficiency, the training data set is broken into 'batches'. The default batch size in TensorFlow is 32. If a model has 5000 training examples (X_train), that works out to roughly 157 batches (5000 / 32, rounded up). The notation on the second line, 157/157 [====, describes which batch is being executed.
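The batch count above can be reproduced with a quick calculation. The values 5000 and 32 are the ones assumed in this example; the final batch may be smaller than 32, so the division is rounded up:

```python
import math

# Assumed values from the example above
num_examples = 5000   # size of X_train
batch_size = 32       # default batch size in TensorFlow/Keras

# The last batch may be partial, so round up
batches_per_epoch = math.ceil(num_examples / batch_size)
print(batches_per_epoch)  # 157
```

This matches the 157/157 shown in the progress output.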

Loss (cost)

Ideally, the cost will decrease as the number of iterations of the algorithm increases. TensorFlow refers to the cost as loss.
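As a toy illustration of this (not the model above; the cost function, starting weight, and learning rate here are made up for the example), a few steps of plain gradient descent on a simple quadratic cost show the loss shrinking each iteration:

```python
# Toy example: minimize loss(w) = (w - 3)^2 with gradient descent.
# All values here are illustrative, not from the model in this post.
w = 0.0
learning_rate = 0.1
losses = []
for epoch in range(10):
    loss = (w - 3.0) ** 2
    grad = 2.0 * (w - 3.0)   # derivative of the loss with respect to w
    w -= learning_rate * grad
    losses.append(loss)

# For this convex cost, the loss decreases on every iteration
print(losses[0], losses[-1])
```

This is the same behavior you want to see in the Keras progress output: the loss value at the end of each epoch trending downward.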
