
Is there a way to plot both training losses for two different networks being trained at the same time?

At the moment I use two FileWriters and write the summaries to two different directories:

writer_cnn  = tf.summary.FileWriter(os.path.join('log', 'cnn'))
writer_dann = tf.summary.FileWriter(os.path.join('log', 'dann'))
s_loss_cnn  = tf.summary.scalar('loss_class', loss_class_cnn)
s_loss_dann = tf.summary.scalar('loss_class', loss_class_dann)

And later in the code:

s_cnn  = sess.run(s_loss_cnn, feed_dict=feed_batch)
s_dann = sess.run(s_loss_dann, feed_dict=feed_batch)
writer_cnn.add_summary(s_cnn, global_step)
writer_dann.add_summary(s_dann, global_step)

But then when I fire up TensorBoard I get two separate plots, loss_class and loss_class_1. I've read in different places, like here and there, that writing to two directories was the way to go. Am I missing something?

Leo
    In the examples you cite, the _same_ loss is written to two different directories. Here, you have _two different_ losses. – P-Gn Apr 21 '17 at 09:03
  • @user1735003 any idea how I could plot those two different losses in the same plot then? – Leo Apr 21 '17 at 10:07
  • https://github.com/tensorflow/tensorflow/issues/7089#issuecomment-280506195 mentions a way, does look a bit brittle though. – P-Gn Apr 21 '17 at 11:03

1 Answer


I suspect that your problem is that you add all the operations to the same graph (the default graph).

Try creating a separate graph for each network and passing it to the corresponding writer via the `graph` parameter.

Something like this:

def graph1():
    g1 = tf.Graph()
    with g1.as_default() as g:
        ...  # define your ops here
    with tf.Session(graph=g) as sess:
        ...  # run the training and write summaries via the writer

Create a similar function graph2() and then invoke both.

Salvador Dali
  • Correct me if I'm wrong, but this would imply that I no longer train both networks at the same time, no? I would have `with tf.Session(graph=g1) as sess: #do first training` and then `with tf.Session(graph=g2) as sess: #do second training`? – Leo Apr 21 '17 at 08:26
  • @Leo yes, you are right. I somehow missed that part. But if you add things to the same default graph, TB will show this graph no matter which directory you write it to – Salvador Dali Apr 21 '17 at 08:45
  • Do you mean the actual graph visualization? Because I don't really care about it. Although, what I take from what you say is that if I put two summaries with the same name in the same graph, then there is no way to plot both scalars, as TF will rename the second one. Is this correct? And is there a workaround? (Except splitting the graphs, and thus the sessions.) – Leo Apr 21 '17 at 08:57