
I'm currently trying to get started with TensorFlow. While doing so, I ran into some problems with TensorBoard. Some of them have already been fixed, but I'm still facing the issue that TensorBoard shows nothing when I access it through my tunnel connection.

I'm certain this has nothing to do with my tunnel setup, because TensorBoard does show information when I use a high-level API like Keras. I suspect that my code below creates empty event files and that my attempt to save summaries is wrong. Could someone clarify why TensorBoard does not show any information?
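As a side note, the batch count computed at the top of the code is just a ceiling division; a stdlib-only sketch of the same calculation (the value of `m` here is hypothetical):

```python
import math

m = 1000          # hypothetical number of training rows
batch_size = 100

# Equivalent to int(np.ceil(m / batch_size)), without numpy:
n_batches = (m + batch_size - 1) // batch_size
assert n_batches == math.ceil(m / batch_size)
```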

n_epochs = 10
learning_rate = 0.01
batch_size = 100 
n_batches = int(np.ceil(m/ batch_size))

get_ipython().system_raw('tensorboard --logdir ./test_log --host 0.0.0.0 --port 6006')
get_ipython().system_raw('./ngrok http 6006 &')
! curl -s http://localhost:4040/api/tunnels | python3 -c \
"import sys, json; print(json.load(sys.stdin)['tunnels'][0]['public_url'])"

X = tf.placeholder(tf.float32, shape=(None, n + 1), name="X")
y = tf.placeholder(tf.float32, shape=(None, 1), name="y")
theta = tf.Variable(tf.random_uniform([n + 1, 1], -1.0, 1.0), name="theta")

yPredictions = tf.matmul(X, theta)
error = yPredictions - y
mse = tf.reduce_mean(tf.square(error), name="mse")

# Define the optimizer before using it to build the training op
optimizer = tf.train.GradientDescentOptimizer(learning_rate=learning_rate)
training_op = optimizer.minimize(mse)

mse_summary = tf.summary.scalar("MSE", mse)
file_writer = tf.summary.FileWriter("./test_log", tf.get_default_graph())

init = tf.global_variables_initializer()

with tf.Session() as session:
  session.run(init)

  for epoch in range(n_epochs):
    for batch_index in range(n_batches):
      # Get batch data
      X_batch, y_batch = fetch_batch(epoch, batch_index, batch_size)

      # Add a summary for TensorBoard
      if batch_index % 10 == 0:
        summary_string = mse_summary.eval(feed_dict={X: X_batch, y: y_batch})
        step = epoch * n_batches + batch_index
        file_writer.add_summary(summary_string, step)

      session.run(training_op, feed_dict={X: X_batch, y: y_batch})
  file_writer.close()
  best_theta = theta.eval()
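For reference, the tunnel-URL extraction from the curl one-liner above can be reproduced in plain Python; a minimal stdlib-only sketch, where the sample response shape is assumed from ngrok's `/api/tunnels` endpoint and the URL itself is made up:

```python
import json

# Hypothetical sample of the JSON returned by http://localhost:4040/api/tunnels
# (shape assumed from the curl one-liner; the URL is a placeholder).
sample = '{"tunnels": [{"public_url": "https://example.ngrok.io"}]}'

# Same field access as the one-liner: first tunnel's public URL.
public_url = json.loads(sample)["tunnels"][0]["public_url"]
print(public_url)
```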
  • Have you looked at http://ischlag.github.io/2016/06/04/how-to-use-tensorboard/ and https://www.tensorflow.org/programmers_guide/summaries_and_tensorboard? – Colonder Jun 17 '18 at 00:03
  • Also: https://stackoverflow.com/questions/47818822/can-i-use-tensorboard-with-google-colab – Colonder Jun 17 '18 at 00:45
  • Yes, I did, but reading through my program over and over again while comparing my code to the exact StackOverflow question you mentioned did not result in any discoveries. The strange thing is that I'm not able to see a graph representation of my computation graph in tensorboard – Nouri Alexander Hilscher Jun 17 '18 at 09:56
  • Is the tfrecords file created by the file writer empty? Are you pointing tensorboard to the directory in which this file is? – Alexandre Passos Jun 18 '18 at 20:08
  • Yes, the file is definitely created and TensorBoard is pointed to it. My assumption is that I accidentally created an empty file due to an error in my computation graph. – Nouri Alexander Hilscher Jun 19 '18 at 17:48
  • Please have a look at these [colab notebook](https://colab.research.google.com/github/tensorflow/tensorboard/blob/master/docs/tensorboard_in_notebooks.ipynb) and [guide](https://www.tensorflow.org/guide) –  Oct 22 '21 at 01:42

0 Answers