Keras Not Training on Entire Dataset


The number 1875 shown while fitting the model is not the number of training samples; it is the number of batches per epoch.

model.fit includes an optional batch_size argument which, according to the documentation:

If unspecified, batch_size will default to 32.

So, what happens here is: you fit with the default batch size of 32 (since you have not specified anything different), so the total number of batches for your 60,000 samples is

60000 / 32 = 1875
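The arithmetic above can be verified directly. The following is a minimal sketch (the helper name `batches_per_epoch` is ours for illustration, not a Keras API) of how the step count on the progress bar is derived, assuming a partial final batch still counts as one step:

```python
import math

def batches_per_epoch(num_samples, batch_size=32):
    """Number of gradient-update steps Keras reports per epoch.

    A partial final batch still counts as one step, hence the ceiling.
    """
    return math.ceil(num_samples / batch_size)

# 60,000 MNIST training images with the default batch size of 32:
print(batches_per_epoch(60000))       # → 1875
# Passing an explicit batch_size changes the count on the progress bar:
print(batches_per_epoch(60000, 128))  # → 469
```

So all 60,000 samples are used each epoch; the progress bar simply counts batches, not samples.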

Model fitting doesn't use all of the provided data

The model is being trained with a batch size of 32; hence there are 60,000 / 32 = 1875 batches.

Although the TensorFlow documentation shows batch_size=None in the overview of the fit function, the description of this argument says:

batch_size: Integer or None. Number of samples per gradient update. If unspecified, batch_size will default to 32. Do not specify the batch_size if your data is in the form of datasets, generators, or keras.utils.Sequence instances (since they generate batches).


