
I created a simple TensorFlow model:

import tensorflow as tf

tf.reset_default_graph()

x_data = [1,2,3]
y_data = [3,4,5]

X = tf.placeholder(tf.float32, name="X")
Y = tf.placeholder(tf.float32, name="Y")

W = tf.Variable(tf.random_uniform([1], -1.0, 1.0), name='W')
b = tf.Variable(tf.random_uniform([1], 0.0, 2.0), name='b')

hypothesis = tf.add(b, tf.multiply(X,W), name="op_restore")

saver = tf.train.Saver()
cost = tf.reduce_mean(tf.square(hypothesis - Y))
optimizer = tf.train.GradientDescentOptimizer(learning_rate=0.1)

train_op = optimizer.minimize(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    tf.train.write_graph(sess.graph_def, '.', 'tfandroid.pbtxt')

    for step in range(100):
        _, cost_val = sess.run([train_op, cost], feed_dict={X:x_data, Y:y_data})
        print((step, cost_val, sess.run(W), sess.run(b)))

    saver.save(sess, './tfandroid.ckpt')

    print("\n == Test ==")
    print("X: 5, Y: ", sess.run(hypothesis, feed_dict={X:5}))
    print("X: 2.5, Y: ", sess.run(hypothesis, feed_dict={X:2.5}))

Training seems to work, but when I restore the model and run it again (code below), the result is not what I expect: for X = 4.0 I expected about 6, but I got None.

Below is the re-use code. Can you tell me what's wrong with it?

import tensorflow as tf

tf.reset_default_graph()

with tf.Session() as sess:
    saver = tf.train.import_meta_graph('tfandroid.ckpt.meta')
    saver.restore(sess, tf.train.latest_checkpoint('./'))
    graph = tf.get_default_graph()

    W = graph.get_tensor_by_name("W:0")
    b = graph.get_tensor_by_name("b:0")
    X = graph.get_tensor_by_name("X:0")

    print('sess.run(W) = ', sess.run(W))
    print('sess.run(b) = ', sess.run(b))

    feed_dict = {X: 4.0}

    hypothesis = graph.get_operation_by_name("op_restore")
    print(hypothesis)

    print(sess.run(hypothesis, feed_dict))

1 Answer


You should do the same for the restore op as you are doing with W, b, and X. In fact, you don't need to restore W and b at all if you aren't going to use them; you just need to get op_restore as follows and run it with a feed_dict, as you did in your code:

hypothesis = graph.get_operation_by_name("op_restore").output[0]
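
Note that the attribute on tf.Operation is spelled outputs (plural), and the reason you saw None in the first place is that sess.run returns None when you fetch an Operation rather than a Tensor. A quick illustrative sketch, assuming the graph from the question has already been restored into sess exactly as in your restore script:

# `graph` and `sess` come from the restore script in the question
# (tf.train.import_meta_graph + saver.restore).
X = graph.get_tensor_by_name("X:0")
op = graph.get_operation_by_name("op_restore")      # a tf.Operation
print(sess.run(op, feed_dict={X: 4.0}))             # prints None: an Operation has no value to fetch
print(sess.run(op.outputs[0], feed_dict={X: 4.0}))  # prints the prediction (about 6 for X = 4 once training has converged)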
  • I already did that as you said, but I got this error: ValueError: Name 'op_restore:0' appears to refer to a Tensor, not a Operation. Please help me. – yunbum Apr 17 '18 at 13:36
  • Thanks so much Eliethesaiyan, but it's still not working: AttributeError: 'Operation' object has no attribute 'output'. By the way, isn't saving a model, feeding new placeholder values, and calling sess.run the common way to do this? – yunbum Apr 18 '18 at 04:55
  • Here is a good question about restoring a model; the idea is that the model can be saved and used later. I guess restoring it in the same graph might have some bad effects, since variable names (especially placeholders) would be renamed: https://stackoverflow.com/questions/33759623/tensorflow-how-to-save-restore-a-model – Eliethesaiyan Apr 18 '18 at 07:24
  • I solved the problem with graph.get_tensor_by_name; the linked example was helpful. Thanks, sir :) – yunbum Apr 18 '18 at 10:49
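
For reference, a minimal sketch of a complete restore script along the lines the comments converge on: fetch the output tensor "op_restore:0" by name instead of the Operation itself, since running a Tensor returns its value while running an Operation returns None. Paths and tensor names are taken from the question; this is not verified against the original checkpoint.

import tensorflow as tf

tf.reset_default_graph()

with tf.Session() as sess:
    # Rebuild the graph from the .meta file and load the trained variables.
    saver = tf.train.import_meta_graph('./tfandroid.ckpt.meta')
    saver.restore(sess, tf.train.latest_checkpoint('./'))
    graph = tf.get_default_graph()

    # "op_restore:0" is the first output tensor of the op named "op_restore".
    X = graph.get_tensor_by_name("X:0")
    hypothesis = graph.get_tensor_by_name("op_restore:0")

    print(sess.run(hypothesis, feed_dict={X: 4.0}))  # should print roughly [6.]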