Is there a way to plot the training losses of two different networks being trained at the same time on the same TensorBoard chart?
At the moment I use two FileWriters
and save the summaries to two different directories:
import os
import tensorflow as tf

# One writer per network, each logging to its own subdirectory of 'log'
writer_cnn = tf.summary.FileWriter(os.path.join('log', 'cnn'))
writer_dann = tf.summary.FileWriter(os.path.join('log', 'dann'))

# Scalar summaries for the two classification losses, given the same name
s_loss_cnn = tf.summary.scalar('loss_class', loss_class_cnn)
s_loss_dann = tf.summary.scalar('loss_class', loss_class_dann)
And later in the code:
# Evaluate each summary and write it out under the matching directory
s_cnn = sess.run(s_loss_cnn, feed_dict=feed_batch)
s_dann = sess.run(s_loss_dann, feed_dict=feed_batch)
writer_cnn.add_summary(s_cnn, global_step)
writer_dann.add_summary(s_dann, global_step)
But when I fire up TensorBoard I get two separate charts, loss_class and loss_class_1, instead of one chart with both curves. I've read in different places, like here and there, that writing to two directories is the way to go. Am I missing something?
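My guess is that, because both summary ops live in the same default graph, the second tf.summary.scalar('loss_class', ...) call gets a uniquified name, and TensorBoard then groups charts by that resulting tag. A rough pure-Python sketch of the naming behavior I think I'm seeing (an illustration only, not TensorFlow's actual implementation; unique_name is a hypothetical helper):

```python
def unique_name(taken, requested):
    """Return `requested`, or `requested_1`, `requested_2`, ... if already taken.

    Sketch of how I understand duplicate op names are resolved
    within a single graph (illustration, not the real TF code).
    """
    if requested not in taken:
        taken.add(requested)
        return requested
    i = 1
    while f"{requested}_{i}" in taken:
        i += 1
    name = f"{requested}_{i}"
    taken.add(name)
    return name

names = set()
tag_cnn = unique_name(names, "loss_class")   # first call keeps the name
tag_dann = unique_name(names, "loss_class")  # second call gets a suffix
print(tag_cnn, tag_dann)  # loss_class loss_class_1
```

If that's what is happening, it would explain why the two losses end up on separate charts even though the writers log to different directories.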