
I am using TensorBoard (as mentioned here) to monitor the loss and accuracy curves of my model.

import keras

tbCallBack = keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,
                                         write_graph=True, write_images=True)

While this does show the loss and accuracy after each epoch, I would like to know whether it is possible to log this information several times per epoch, since my dataset is quite big.

This is how I train the model:

n_train # number of train samples
n_valid # number of validation samples
epochs # number of epochs
batch_size # batch size

model.fit_generator(
    generator=data_generator(X_train, Y_train, batch_sz=batch_size),
    steps_per_epoch=n_train//batch_size,
    validation_data=data_generator(X_valid, Y_valid, batch_sz=batch_size),
    validation_steps=n_valid//batch_size,
    epochs=epochs,
    callbacks=[tbCallBack]
)
lucasrodesg
    You can define a custom callback with an `on_batch_end` method that writes the logs to the location TensorBoard reads from. You can look at https://github.com/keras-team/keras/blob/master/keras/callbacks.py#L816 to see how Keras writes them after each epoch. – Ankur Ankan Mar 17 '18 at 21:49
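The custom-callback idea from the comment can be sketched as follows. This is a minimal, hedged illustration: a stand-in `Callback` base class replaces `keras.callbacks.Callback` so the example runs without TensorFlow installed, and a plain list (`history`) stands in for the TensorBoard summary writer. In real code you would subclass `keras.callbacks.Callback`, pass the instance in the `callbacks=[...]` list of `fit_generator`, and write each metric with the summary-writer API instead of appending to a list. The class name `BatchLossLogger` and the `log_every` parameter are hypothetical, not part of Keras.

```python
class Callback:
    """Minimal stand-in for keras.callbacks.Callback (illustration only)."""
    def on_batch_end(self, batch, logs=None):
        pass

class BatchLossLogger(Callback):
    """Hypothetical callback that records the loss every `log_every` batches."""
    def __init__(self, log_every=10):
        self.log_every = log_every   # write a point every N batches
        self.history = []            # stand-in for the TensorBoard writer
        self.seen = 0                # global batch counter across epochs

    def on_batch_end(self, batch, logs=None):
        # Keras calls this after every batch with a `logs` dict
        # containing e.g. 'loss' and 'acc' for that batch.
        logs = logs or {}
        self.seen += 1
        if self.seen % self.log_every == 0:
            # Real callback: write a scalar summary at step `self.seen`.
            self.history.append((self.seen, logs.get('loss')))

# Simulated training loop driving the callback, as Keras would.
logger = BatchLossLogger(log_every=10)
for step in range(30):
    logger.on_batch_end(step, logs={'loss': 1.0 / (step + 1)})
# logger.history now holds one (global_step, loss) point per 10 batches.
```

As an aside, newer versions of the bundled TensorBoard callback accept an `update_freq` argument (`'batch'`, `'epoch'`, or an integer number of samples), which may give you per-batch logging without a custom callback; check the documentation of your Keras version before rolling your own.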
