I am using TensorBoard (as described here) to monitor the loss and accuracy curves of my model:
import keras

tbCallBack = keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,
                                         write_graph=True, write_images=True)
While this does show the loss and accuracy after each epoch, I would like to know whether it is possible to log these metrics several times per epoch (for example, every N batches), since my dataset is quite large.
This is how I train the model:
n_train = ...      # number of training samples
n_valid = ...      # number of validation samples
epochs = ...       # number of epochs
batch_size = ...   # batch size

model.fit_generator(
    generator=data_generator(X_train, Y_train, batch_sz=batch_size),
    steps_per_epoch=n_train // batch_size,
    validation_data=data_generator(X_valid, Y_valid, batch_sz=batch_size),
    validation_steps=n_valid // batch_size,
    epochs=epochs,
    callbacks=[tbCallBack]
)
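To illustrate what I am after, here is a rough sketch of the kind of per-batch logging I have in mind, assuming a custom Keras Callback that writes summaries with the TF 1.x tf.summary.FileWriter API (the class name BatchTensorBoard and the log_every parameter are placeholders of my own; I have not verified that this is the right approach):

import keras
import tensorflow as tf

class BatchTensorBoard(keras.callbacks.Callback):
    """Sketch: write training metrics to TensorBoard every `log_every` batches."""
    def __init__(self, log_dir='./Graph', log_every=100):
        super(BatchTensorBoard, self).__init__()
        self.writer = tf.summary.FileWriter(log_dir)  # TF 1.x summary writer
        self.log_every = log_every
        self.step = 0                                  # global batch counter

    def on_batch_end(self, batch, logs=None):
        logs = logs or {}
        self.step += 1
        if self.step % self.log_every == 0:
            for name, value in logs.items():
                if name in ('batch', 'size'):          # skip Keras bookkeeping keys
                    continue
                summary = tf.Summary(value=[tf.Summary.Value(tag=name,
                                                             simple_value=float(value))])
                self.writer.add_summary(summary, self.step)
            self.writer.flush()

    def on_train_end(self, logs=None):
        self.writer.close()

Is a custom callback like this the way to go, or does the built-in TensorBoard callback already offer an option for sub-epoch logging?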