160

I have built a neural network with Keras. I would like to visualize its data with TensorBoard, so I have used:

keras.callbacks.TensorBoard(log_dir='/Graph', histogram_freq=0,
                            write_graph=True, write_images=True)

as explained in keras.io. When I run the callback I get <keras.callbacks.TensorBoard at 0x7f9abb3898>, but I don't get any files in my "Graph" folder. Is there something wrong with how I am using this callback?

Simone
  • I would suggest setting `histogram_freq` to `1`. "histogram_freq: frequency (in epochs) at which to compute activation histograms for the layers of the model. If set to 0, histograms won't be computed." – Matt Kleinsmith Mar 12 '17 at 07:29
  • Be careful: "/Graph" makes a directory in the root directory, while "./Graph" makes one in the working directory. – Matt Kleinsmith Mar 12 '17 at 07:31
  • @MattKleinsmith If set to 0, only the activation and weight histograms for the layers of the model (computed via validation data) won't be computed; metrics will still be logged. – BugKiller Aug 21 '18 at 15:03
  • I think it's better to give unique name to logdir look at https://stackoverflow.com/a/54949146/1179925 – mrgloom Mar 01 '19 at 17:45

10 Answers

241
keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,  
          write_graph=True, write_images=True)

This line creates a TensorBoard callback object; you should capture that object and pass it to the `fit` function of your model.

tbCallBack = keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0, write_graph=True, write_images=True)
...
model.fit(...inputs and parameters..., callbacks=[tbCallBack])

This way the callback object is passed to the function. It will run during training and output files that can be used with TensorBoard.

If you want to visualize the files created during training, run in your terminal:

tensorboard --logdir path_to_current_dir/Graph 
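As a rough, framework-free illustration of why capturing the object matters (toy stand-in classes, not Keras internals): `fit` calls `set_model` and the per-epoch hooks on every callback it is given, so a callback that is never passed in is never invoked.

```python
# Simplified sketch of the callback mechanism -- an illustration, not the
# real Keras implementation.
class TinyCallback:
    def set_model(self, model):
        self.model = model      # fit() hands the model to each callback
        self.seen = []

    def on_epoch_end(self, epoch, logs=None):
        # the real TensorBoard callback writes event files here
        self.seen.append(epoch)

class TinyModel:
    def fit(self, epochs, callbacks=()):
        for cb in callbacks:
            cb.set_model(self)          # done for you by fit()
        for epoch in range(epochs):
            for cb in callbacks:
                cb.on_epoch_end(epoch)  # hook fired once per epoch

cb = TinyCallback()
TinyModel().fit(epochs=3, callbacks=[cb])
print(cb.seen)  # → [0, 1, 2]
```

A callback object that is merely constructed (as in the question) is never wired into this loop, which is why no files appear.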
Nassim Ben
  • I used this and got the following error when `write_images=False` – abdul qayyum May 22 '17 at 22:49
  • InvalidArgumentError (see above for traceback): Tensor must be 4-D with last dim 1, 3, or 4, not [1,3,3,256,256,1] [[Node: conv_3.2_2/kernel_0_1 = ImageSummary[T=DT_FLOAT, bad_color=Tensor, max_images=3, _device="/job:localhost/replica:0/task:0/cpu:0"](conv_3.2_2/kernel_0_1/tag, ExpandDims_50)]] – abdul qayyum May 22 '17 at 22:49
  • And something saying the placeholder is missing `dtype=float` when it's `True`. Any idea? – abdul qayyum May 22 '17 at 22:50
  • The Scalars tab is still empty, although I can see my model architecture on the Graphs tab? – Irtaza May 24 '17 at 11:18
  • this only produces scalars for training loss & accuracy. how do you do the same for the validation_data which is passed to the fit function? – Utku Ufuk Mar 27 '18 at 07:20
  • Important to note: if TensorBoard doesn't log any histograms via `tf.summary.histogram`, check whether `histogram_freq=0` is set – histograms are only written when `histogram_freq` is non-zero! – Agile Bean Apr 29 '19 at 14:38
  • Is there a way to see the TensorBoard graph as live? – Benyamin Jafari Nov 14 '19 at 08:13
50

This is how you use the TensorBoard callback:

from keras.callbacks import TensorBoard

tensorboard = TensorBoard(log_dir='./logs', histogram_freq=0,
                          write_graph=True, write_images=False)
# define model
model.fit(X_train, Y_train,
          batch_size=batch_size,
          epochs=nb_epoch,
          validation_data=(X_test, Y_test),
          shuffle=True,
          callbacks=[tensorboard])
Martin Thoma
  • Is there a way to structure the output of tensorboard better? Does Keras do some optimization in that regard? – Nickpick Jul 16 '17 at 23:33
  • @nickpick I don't know what you mean. But I think this might be a candidate for another question. – Martin Thoma Jul 25 '17 at 16:13
  • here we go: https://stackoverflow.com/questions/45309153/structure-a-keras-tensorboard-graph – Nickpick Jul 25 '17 at 16:41
21

Change

keras.callbacks.TensorBoard(log_dir='/Graph', histogram_freq=0,  
          write_graph=True, write_images=True)

to

tbCallBack = keras.callbacks.TensorBoard(log_dir='Graph', histogram_freq=0,  
          write_graph=True, write_images=True)

and set your model:

tbCallBack.set_model(model)

Run in your terminal

tensorboard  --logdir Graph/
Leandro Souza
16

If you are working with the Keras library and want to use TensorBoard to print graphs of accuracy and other variables, here are the steps to follow.

Step 1: Import the TensorBoard callback from the Keras callbacks library with the command below.

from keras.callbacks import TensorBoard

Step 2: Include the command below in your program just before the "model.fit()" call.

tensor_board = TensorBoard(log_dir='./Graph', histogram_freq=0, write_graph=True, write_images=True)

Note: Use "./Graph". It will generate the Graph folder in your current working directory; avoid using "/Graph".

Step 3: Include the TensorBoard callback in "model.fit()". A sample is given below.

model.fit(X_train, y_train, batch_size=batch_size, epochs=nb_epoch, verbose=1, validation_split=0.2, callbacks=[tensor_board])

Step 4: Run your code and check whether the Graph folder is in your working directory. If the code above works correctly, you will have a "Graph" folder in your working directory.

Step 5: Open a terminal in your working directory and type the command below.

tensorboard --logdir ./Graph

Step 6: Now open your web browser and enter the address below.

http://localhost:6006

After entering it, the TensorBoard page will open, where you can see your graphs of different variables.
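A quick sanity check for Step 4 can be done in plain Python, with no Keras needed (the `./Graph` path here is the `log_dir` used above):

```python
import os

# Did training create the log folder that Step 5's
# `tensorboard --logdir ./Graph` will read from?
log_dir = "./Graph"
os.makedirs(log_dir, exist_ok=True)  # normally the callback creates this itself
print(os.path.isdir(log_dir))        # → True
print("event files:", os.listdir(log_dir))
```

If the folder exists but is empty, the callback was constructed but never passed to `model.fit()`.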

Sunil Sharma
11

Here is some code:

import keras
import keras.backend as K

K.set_learning_phase(1)
K.set_image_data_format('channels_last')

tb_callback = keras.callbacks.TensorBoard(
    log_dir=log_path,
    histogram_freq=2,
    write_graph=True
)
tb_callback.set_model(model)
callbacks = []
callbacks.append(tb_callback)

# Train net:
history = model.fit(
    [x_train],
    [y_train, y_train_c],
    batch_size=int(hype_space['batch_size']),
    epochs=EPOCHS,
    shuffle=True,
    verbose=1,
    callbacks=callbacks,
    validation_data=([x_test], [y_test, y_test_coarse])
).history

# Test net:
K.set_learning_phase(0)
score = model.evaluate([x_test], [y_test, y_test_coarse], verbose=0)

Basically, histogram_freq=2 is the most important parameter to tune when calling this callback: it sets the interval (in epochs) at which histograms are computed, with the goal of generating fewer files on disk.
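As a sketch of that interval logic (the exact epoch alignment is my assumption, not the callback's verbatim code):

```python
# Which epochs write histogram summaries for a given histogram_freq,
# assuming "every histogram_freq-th epoch" as the docstring suggests.
def histogram_epochs(total_epochs, histogram_freq):
    if histogram_freq == 0:
        return []  # histograms disabled entirely
    return [e for e in range(total_epochs) if e % histogram_freq == 0]

print(histogram_epochs(10, 2))  # → [0, 2, 4, 6, 8]
print(histogram_epochs(10, 0))  # → []
```

So `histogram_freq=2` over 10 epochs writes histograms roughly half as often as `histogram_freq=1`, which is the disk-space saving mentioned above.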

So here is an example visualization of the evolution of values for the last convolution throughout training, as seen in TensorBoard under the "Histograms" tab (I found the "Distributions" tab to contain very similar charts, but flipped on the side):

[image: tensorboard weights monitoring]

In case you would like to see a full example in context, you can refer to this open-source project: https://github.com/Vooban/Hyperopt-Keras-CNN-CIFAR-100

Guillaume Chevalier
  • I downvoted this because a large part of this is actually questions and not an answer to the question. Don't ask new questions in answers, whether it is a part or the entire purpose of an answer. – Zoe Oct 23 '17 at 18:02
  • I edited the question to remove what you mentioned. In fact, this callback is very hard to use properly from the documentation at the time I answered. – Guillaume Chevalier Oct 25 '17 at 04:13
  • To answer "How do I use the TensorBoard callback of Keras?", all the other answers are incomplete and respond only to the small context of the question - no one tackles embeddings for example. At least, I had documented potential errors or things to avoid in my answer. I think I raised important questions that no one even deigns to think about yet. I am still waiting for a complete answer. This callback is ill-documented, too, like cancer. – Guillaume Chevalier Oct 25 '17 at 04:21
5

If you are using Google Colab, a simple way to visualize the graph would be:

import tensorboardcolab as tb

tbc = tb.TensorBoardColab()
tensorboard = tb.TensorBoardColabCallback(tbc)


history = model.fit(x_train, # Features
                    y_train, # Target vector
                    batch_size=batch_size, # Number of observations per batch
                    epochs=epochs, # Number of epochs
                    # early_stopping is assumed to be defined beforehand,
                    # e.g. keras.callbacks.EarlyStopping()
                    callbacks=[early_stopping, tensorboard],
                    verbose=1, # Print description after each epoch
                    # Note: validation_data takes precedence over
                    # validation_split when both are given
                    validation_split=0.2,
                    validation_data=(x_test, y_test)) # Test data-set to evaluate the model at the end of training
DINA TAKLIT
4

Create the Tensorboard callback:

from keras.callbacks import TensorBoard
from datetime import datetime
logDir = "./Graph/" + datetime.now().strftime("%Y%m%d-%H%M%S") + "/"
tb = TensorBoard(log_dir=logDir, histogram_freq=2, write_graph=True, write_images=True, write_grads=True)

Pass the Tensorboard callback to the fit call:

history = model.fit(X_train, y_train, epochs=200, callbacks=[tb])

When running the model, if you get a Keras error of

"You must feed a value for placeholder tensor"

try resetting the Keras session before the model creation by doing:

import keras.backend as K
K.clear_session()
rsc
2

You wrote log_dir='/Graph'; did you mean ./Graph instead? At the moment you are sending it to /Graph at the filesystem root.
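A quick way to see the difference between the two paths, in plain Python with no Keras needed:

```python
import os

# '/Graph' is absolute (anchored at the filesystem root on POSIX), while
# './Graph' is relative to wherever the process was started.
print(os.path.isabs("/Graph"))     # → True
print(os.path.isabs("./Graph"))    # → False
print(os.path.abspath("./Graph"))  # e.g. /home/user/project/Graph
```

This is why the event files never show up next to your script when you pass `/Graph`: they are (attempted to be) written at the root instead.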

Part
2

You should check out Losswise (https://losswise.com); it has a plugin for Keras that's easier to use than TensorBoard and has some nice extra features. With Losswise you'd just use from losswise.libs import LosswiseKerasCallback and then callback = LosswiseKerasCallback(tag='my fancy convnet 1') and you're good to go (see https://docs.losswise.com/#keras-plugin).

nicodjimenez
  • Disclaimer: OP is the founder of Losswise, which is a paid product (although with a pretty generous free tier) – Michael Mior Dec 21 '17 at 17:52
  • @MichaelMior is correct, although it isn't a paid product yet and may never be (other than on prem licenses in the future maybe) – nicodjimenez Feb 17 '18 at 08:42
2

There are a few things to check.

First, use ./Graph, not /Graph.

Second, when you use the TensorBoard callback, always pass validation data, because without it, it won't start.

Third, if you want to use anything other than scalar summaries, you should only use the fit method, because fit_generator will not work. Alternatively, you can rewrite the callback to work with fit_generator.

To add callbacks, just pass them to model.fit(..., callbacks=your_list_of_callbacks).

Andrey Nikishaev