From the Keras documentation, the signature of the model.fit method is:
fit(x=None, y=None,
    batch_size=None,
    epochs=1,
    verbose=1,
    callbacks=None,
    validation_split=0.0,
    validation_data=None,
    shuffle=True,
    class_weight=None,
    sample_weight=None,
    initial_epoch=0,
    steps_per_epoch=None,
    validation_steps=None)
"'val_loss' is recorded if validation is enabled in fit, and 'val_acc' is recorded if validation and accuracy monitoring are enabled."
That quote comes from the documentation of the keras.callbacks.Callback object, which can be passed through the callbacks parameter of the fit method above.
Instead of the History callback, which you have used, it can be passed in as follows:
from keras.callbacks import Callback

logs = Callback()
model.fit(train_data,
          train_labels,
          epochs=64,
          batch_size=10,
          shuffle=True,
          validation_split=0.2,
          callbacks=[logs])
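Note that a bare Callback() instance does not record anything on its own; to actually capture val_loss you would normally subclass it. A minimal sketch, where the class name ValLossLogger and its val_losses list are illustrative and not part of the original code:

from keras.callbacks import Callback

class ValLossLogger(Callback):
    # collects the validation loss reported at the end of each epoch
    def on_train_begin(self, logs=None):
        self.val_losses = []

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        # 'val_loss' is only present when validation is enabled in fit
        if 'val_loss' in logs:
            self.val_losses.append(logs['val_loss'])

val_logger = ValLossLogger()
model.fit(train_data,
          train_labels,
          epochs=64,
          batch_size=10,
          shuffle=True,
          validation_split=0.2,
          callbacks=[val_logger])
print(val_logger.val_losses)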
"'val_loss' is recorded if validation is enabled in fit" means that when calling model.fit you either set the validation_split parameter or pass the validation_data parameter with a tuple (x_val, y_val) or (x_val, y_val, val_sample_weights) on which the loss and any model metrics are evaluated at the end of each epoch, as sketched below.
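For example, a minimal sketch of the validation_data variant (x_val and y_val are assumed to be held-out arrays; they are not part of the original code):

# x_val and y_val are assumed to be held-out NumPy arrays
model.fit(train_data,
          train_labels,
          epochs=64,
          batch_size=10,
          shuffle=True,
          validation_data=(x_val, y_val))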
"A History object. Its History.history attribute is a record of training loss values and metrics values at successive epochs, as well as validation loss values and validation metrics values (if applicable)." - Keras documentation (return value of the model.fit method)
You are using the History callback in your model as follows:
model.fit(train_data,
          train_labels,
          epochs=64,
          batch_size=10,
          shuffle=True,
          validation_split=0.2,
          callbacks=[history])
history.history will give you a dictionary with the keys loss, acc, val_loss and val_acc, if you save the return value of model.fit in a variable as below (fit already returns the History object, so it does not need to be passed again through callbacks):
history = model.fit(train_data,
                    train_labels,
                    epochs=64,
                    batch_size=10,
                    shuffle=True,
                    validation_split=0.2)
history.history
The output will be like the following:
{'val_loss': [14.431451635814849,
14.431451635814849,
14.431451635814849,
14.431451635814849,
14.431451635814849,
14.431451635814849,
14.431451635814849,
14.431451635814849,
14.431451635814849,
14.431451635814849],
'val_acc': [0.1046428571712403,
0.1046428571712403,
0.1046428571712403,
0.1046428571712403,
0.1046428571712403,
0.1046428571712403,
0.1046428571712403,
0.1046428571712403,
0.1046428571712403,
0.1046428571712403],
'loss': [14.555215610322499,
14.555215534028553,
14.555215548560733,
14.555215588524229,
14.555215592157273,
14.555215581258137,
14.555215575808571,
14.55521561940511,
14.555215563092913,
14.555215624854679],
'acc': [0.09696428571428571,
0.09696428571428571,
0.09696428571428571,
0.09696428571428571,
0.09696428571428571,
0.09696428571428571,
0.09696428571428571,
0.09696428571428571,
0.09696428571428571,
0.09696428571428571]}
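To visualise these values, a common follow-up is to plot them with matplotlib; a minimal sketch, assuming matplotlib is installed (it is not used in the original answer):

import matplotlib.pyplot as plt

# history.history is the dictionary shown above
plt.plot(history.history['loss'], label='loss')
plt.plot(history.history['val_loss'], label='val_loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()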
You can save the data either with the CSVLogger callback, as suggested in the comments and shown below, or by the longer method of writing a dictionary to a csv file, as described here: writing a dictionary to a csv
from keras.callbacks import CSVLogger

csv_logger = CSVLogger('training.log')
model.fit(X_train, Y_train, callbacks=[csv_logger])
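For the dictionary route, a minimal sketch using the standard csv module (the file name history.csv is just an example):

import csv

# one row per epoch, one column per key in history.history
with open('history.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(history.history.keys())
    writer.writerows(zip(*history.history.values()))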