
I am having a hard time understanding how to get TensorBoard to work properly from a notebook running on Google Colab. Below is the series of code snippets I use to work with TensorBoard.

TensorFlow version: 2.2.0
Eager mode: True
Hub version: 0.8.0
GPU is available

%load_ext tensorboard
import tensorflow as tf
from tensorboard.plugins.hparams import api as hp
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint
from tensorboardcolab import TensorBoardColabCallback

# Helper that builds the callbacks list for one training run
# (the snippet below is the body of this helper; the signature is
# inferred from the names used inside it).
def get_callbacks(monitor_metric, minimum_delta, patience_limit,
                  verbose_value, mode_value, weights_fname,
                  logdir, hparams, tbc):

    callbacks = [

        EarlyStopping(monitor=monitor_metric,
                      min_delta=minimum_delta,
                      patience=patience_limit,
                      verbose=verbose_value,
                      mode=mode_value,
                      restore_best_weights=True),

        ModelCheckpoint(filepath=weights_fname,
                        monitor=monitor_metric,
                        verbose=verbose_value,
                        save_best_only=True,
                        save_weights_only=True),

        tf.keras.callbacks.TensorBoard(logdir),  # used here

        TensorBoardColabCallback(tbc),

        hp.KerasCallback(logdir, hparams)  # used here
    ]

    return callbacks
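
For context, this is roughly how I call the helper for a single run (the argument values here are illustrative placeholders, not the exact ones from my notebook):

# Illustrative call; the values below are placeholders.
run_logdir = "model_one/logs/hparam_tuning/run-0"
callbacks = get_callbacks(monitor_metric="val_loss",
                          minimum_delta=1e-3,
                          patience_limit=5,
                          verbose_value=1,
                          mode_value="min",
                          weights_fname="model_one/weights.h5",
                          logdir=run_logdir,
                          hparams=hparams,  # per-run dict, built further below
                          tbc=tbc)          # TensorBoardColab object, see end of post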

Initializing the hyperparameters that will be logged by TensorBoard:

HP_HIDDEN_UNITS = hp.HParam('batch_size', hp.Discrete([128]))
HP_EMBEDDING_DIM = hp.HParam('embedding_dim', hp.Discrete([50, 100]))
HP_LEARNING_RATE = hp.HParam('learning_rate', hp.Discrete([0.01])) # Adam default: 0.001, SGD default: 0.01, RMSprop default: 0.001
HP_DECAY_STEPS_MULTIPLIER = hp.HParam('decay_steps_multiplier', hp.Discrete([10, 100]))

METRIC_ACCURACY = "hamming_loss"
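
For reference, "hamming_loss" is the name of the custom multi-label metric my model logs. A minimal sketch of such a metric (a simplified assumption; the actual implementation in my notebook may differ):

def hamming_loss(y_true, y_pred, threshold=0.5):
    # Fraction of labels predicted incorrectly (multi-label setting).
    y_pred = tf.cast(y_pred > threshold, y_true.dtype)
    return tf.reduce_mean(tf.abs(y_true - y_pred))

# The name registered in METRIC_ACCURACY has to match the metric name the
# model actually logs, e.g. model.compile(..., metrics=[hamming_loss]).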

Writing the hparams configuration to the TensorBoard logging directory:

import os

hp_logging_directory = os.path.join(os.getcwd(), "model_one/logs/hparam_tuning")

with tf.summary.create_file_writer(hp_logging_directory).as_default():
    hp.hparams_config(
        hparams=[HP_HIDDEN_UNITS, HP_EMBEDDING_DIM, HP_LEARNING_RATE, HP_DECAY_STEPS_MULTIPLIER],
        metrics=[hp.Metric(METRIC_ACCURACY, display_name='hamming_loss')],
    )

# os.path.exists() returns a boolean instead of raising, so check it explicitly.
if os.path.exists(hp_logging_directory):
    print("Directory of hyperparameter logging exists!")
else:
    print("Directory not found!")

Calling the TensorBoard notebook magic:

%tensorboard --logdir model_one/logs/hparam_tuning



I have also installed the tensorboardcolab module and tried the following:

from tensorboardcolab import *

tbc = TensorBoardColab()  # creating a TensorBoardColab object automatically creates a link
writer = tbc.get_writer()  # create a FileWriter
writer.add_graph(tf.get_default_graph())  # add the graph
writer.flush()

Executing the above, I get the following error: `AttributeError: module 'tensorboard.summary._tf.summary' has no attribute 'FileWriter'`
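
As far as I understand, tensorboardcolab relies on TF1-style APIs (a FileWriter class and tf.get_default_graph()) that no longer exist in TF 2.x eager mode; the TF2-native equivalent would be something like this sketch (my assumption, using only tf.summary calls available in 2.2):

# Assumed TF2-style replacement for the FileWriter pattern above
# (eager mode, no default graph), writing under the same log directory.
writer = tf.summary.create_file_writer(hp_logging_directory)
with writer.as_default():
    tf.summary.scalar("example_scalar", 0.5, step=0)  # placeholder value
writer.flush()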

When I try to access localhost:6006, I get the error "This site can’t be reached".

Please check my Colab notebook and kindly write in the comments if you need any additional information that I might have forgotten to include.

NikSp
  • @ParthShah In which part of the code should I add this command? Any hint? Thank you :) ... Added it at the end of the notebook and got the following error: `AttributeError: module 'tensorflow' has no attribute 'disable_v2_behavior'` – NikSp Jul 15 '20 at 18:26
  • Ditch this line: `!pip install --quiet tensorboardcolab`. The cell you had an error in now runs. – Parth Shah Jul 15 '20 at 18:38
  • @ParthShah I find it difficult to follow. I commented out the line you mentioned, but I still get the same error about tensorboardcolab and FileWriter()... Sorry if I missed something from what you wrote. If you don't mind, join the Colab notebook and comment on what you propose I change :) – NikSp Jul 15 '20 at 18:41
  • Could you please refer to this merged [pull request](https://github.com/tensorflow/tensorflow/pull/37223) and also the similar [issue](https://github.com/tensorflow/tensorflow/issues/37113); hope it helps. Thanks –  Oct 09 '21 at 01:07
  • @TensorflowSupport Thanks a lot for the comment. It's been a while since I asked the question. Hopefully, the links posted will help those who struggled with the issue before. Kudos. – NikSp Oct 09 '21 at 07:27
