
I'm using TensorFlow 1.13. When I run the code below, I get an error saying that I can't iterate through a tensor unless I'm in eager mode. Is there a way of doing this without going into eager mode?

import math
import tensorflow as tf

with tf.Session(config=config) as sess:
    context = tf.placeholder(tf.int32, [args.batch_size, None])
    mask = tf.placeholder(tf.int32, [args.batch_size, 2])
    output = model.model(hparams=hparams, X=context)

    for batch_index in range(args.batch_size):
        start = mask[batch_index][0]
        end   = mask[batch_index][1]

        # `start` and `end` are symbolic tensors, not Python ints, so they
        # cannot drive a Python-level loop like this in graph mode.
        for i in range(start, end + 1):
            output['logits'][batch_index, i, context[batch_index, i]].assign(math.inf)

    loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=context[:, 1:], logits=output['logits'][:, :-1]))
piccolo
1 Answer


Can you try using `tf.while_loop`? You could try the following snippet (possibly with minor modifications for your code) and see whether it works:

import math
import tensorflow as tf
gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.9)
with tf.Session(config=tf.ConfigProto(gpu_options=gpu_options)) as sess:
    context = tf.placeholder(tf.int32, [args.batch_size, None])
    mask = tf.placeholder(tf.int32, [args.batch_size, 2])
    output = model.model(hparams=hparams, X=context)



    for batch_index in [0, 1, 2, 3]:  # I have assumed a dummy list because we can't iterate through a 'Dimension'
        start = mask[batch_index][0]
        end   = mask[batch_index][1]

        # Start the loop counter at `start` and run while i <= end,
        # mirroring the original `range(start, end + 1)`.
        i = start
        while_condition = lambda i: tf.less_equal(i, end)

        def body(i):
            # Note: `.assign` only works on a tf.Variable, not on a plain tensor slice.
            output['logits'][batch_index, i, context[batch_index, i]].assign(math.inf)
            return [tf.add(i, 1)]  # the body must return the updated loop variable(s)

        r = tf.while_loop(while_condition, body, [i])

        # for i in range(start, end + 1):
        #     output['logits'][batch_index, i, context[batch_index, i]].assign(math.inf)

    loss = tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=context[:, 1:],  logits=output['logits'][:, :-1]))
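
To make sure the loop actually runs, fetch `r` together with `loss` inside the same `with tf.Session(...)` block. A rough usage sketch (the dummy feed values, shapes, and the `seq_len` name below are assumptions, not from the original code):

    import numpy as np

    # Dummy feed values purely for illustration; shapes and values are assumptions.
    seq_len = 10
    context_batch = np.random.randint(0, 100, size=(args.batch_size, seq_len)).astype(np.int32)
    mask_batch = np.tile([[2, 5]], (args.batch_size, 1)).astype(np.int32)

    # Fetching `r` alongside `loss` forces the while_loop ops to execute;
    # fetching `loss` alone would not run them, because `loss` does not depend on `r`.
    loss_val, r_val = sess.run([loss, r],
                               feed_dict={context: context_batch, mask: mask_batch})

Note that as written, `r` only refers to the loop built for the last `batch_index`; to run all of the loops you would need to keep each loop's result (e.g. in a list) and fetch them all.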
Achintha Ihalage
  • When I run `sess.run(loss, feed_dict={...})` how do I ensure that the `while_loop` is being executed? – piccolo Aug 19 '19 at 11:26
  • I hope that makes sense? – piccolo Aug 19 '19 at 12:34
  • Yes, that makes sense. Can you try some sample inputs, run the session with `loss_r, r_r = sess.run([loss,r], feed_dict={...})`, and see whether you get the expected training performance? – Achintha Ihalage Aug 19 '19 at 12:48
  • This is a different error that happens in the line `labels=context[:, 1:], logits=output['logits'][:, :-1]))`, right? Have a look here: https://stackoverflow.com/questions/39157723/how-to-do-slice-assignment-in-tensorflow. – Achintha Ihalage Aug 19 '19 at 17:00
  • It turns out the error comes from the `assign` part. See here: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/ops/array_ops.py#L978 – piccolo Aug 20 '19 at 09:21
  • I found someone else with a similar problem here : https://www.manongdao.com/q-716462.html – piccolo Aug 20 '19 at 09:39
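
Regarding the slice-assignment error discussed in the last few comments: in graph mode you cannot assign into a slice of an ordinary tensor, but you can build a new logits tensor instead of modifying the existing one in place. Below is a minimal, untested sketch (assuming `output['logits']` has shape `[batch, seq_len, vocab]` with float32 values, and reusing the `context` and `mask` placeholders from above) that replaces the selected entries with `tf.where`, avoiding both `.assign` and the `tf.while_loop`:

logits = output['logits']
seq_len = tf.shape(logits)[1]
vocab_size = tf.shape(logits)[2]

# positions[b, i] is True for start <= i <= end within each batch row
positions = tf.logical_and(
    tf.sequence_mask(mask[:, 1] + 1, maxlen=seq_len),              # i <= end
    tf.logical_not(tf.sequence_mask(mask[:, 0], maxlen=seq_len)))  # i >= start

# target[b, i, v] is True where v == context[b, i] at a selected position
target = tf.logical_and(
    tf.expand_dims(positions, -1),
    tf.cast(tf.one_hot(context, depth=vocab_size), tf.bool))

# Produce a new tensor rather than assigning in place
new_logits = tf.where(target, tf.fill(tf.shape(logits), math.inf), logits)

Feeding `new_logits[:, :-1]` into the cross-entropy (instead of `output['logits'][:, :-1]`) keeps everything as graph ops, so nothing needs eager mode.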