Keras not training on entire dataset
The number 1875 shown while fitting the model is not the number of training samples; it is the number of batches.
model.fit includes an optional argument batch_size which, according to the documentation:

If unspecified, batch_size will default to 32.
So what happens here is that you fit with the default batch size of 32 (since you have not specified anything different), and the total number of batches for your data is 60000/32 = 1875.
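That arithmetic can be checked directly. Keras' progress bar counts steps per epoch, i.e. the ceiling of the sample count divided by the batch size (60000 matches, for example, the MNIST training set):

```python
import math

num_samples = 60000  # e.g. the MNIST training set
batch_size = 32      # the Keras default when batch_size is unspecified

# The progress bar counts batches (steps), not individual samples
steps_per_epoch = math.ceil(num_samples / batch_size)
print(steps_per_epoch)  # → 1875
```

So all 60000 samples are used each epoch; they are simply grouped into 1875 batches.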
Model fitting doesn't use all of the provided data
The model is being trained with a batch size of 32, hence there are 60,000/32 = 1875 batches.
Although the TensorFlow documentation shows batch_size=None in the fit function signature, the description of this argument says:

batch_size: Integer or None. Number of samples per gradient update. If unspecified, batch_size will default to 32. Do not specify the batch_size if your data is in the form of datasets, generators, or keras.utils.Sequence instances (since they generate batches).