
I'm trying to learn TensorFlow and I don't understand how to open and use a graph of type tf.Graph that I saved to a file earlier. Something like this:

import tensorflow as tf

my_graph = tf.Graph()

with my_graph.as_default():
    x = tf.Variable(0)
    b = tf.constant(-5)
    k = tf.constant(2)

    y = k*x + b

tf.train.write_graph(my_graph, '.', 'graph.pbtxt')

f = open('graph.pbtxt', "r")

# Do something with "f" to get my saved graph and use it below in
# tf.Session(graph=...) instead of dots

with tf.Session(graph=...) as sess:
    tf.initialize_all_variables().run()

    y1 = sess.run(y, feed_dict={x: 5})
    y2 = sess.run(y, feed_dict={x: 10})
    print(y1, y2)
– Sergey (question edited by jdehesa)
  • I was looking for this, and this link helped me: https://github.com/irfansharif/tensorflow/blob/master/converter.py Basically, we have to use text_format.Merge from the google.protobuf package to convert a .pbtxt file into a GraphDef (see the sketch after these comments) – Dheeraj Peri Mar 03 '18 at 01:14
  • Here is a detailed example with the newer TensorFlow version 1.7: https://stackoverflow.com/a/52222383/5904928 – Aaditya Ura Sep 07 '18 at 12:19
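Following the first comment, a minimal sketch of that Merge-based conversion, assuming the file was written as graph.pbtxt as in the question (the result is the same GraphDef that the first answer below builds with Parse):

import tensorflow as tf
from google.protobuf import text_format

# Read the text-format .pbtxt and merge it into an empty GraphDef
gdef = tf.GraphDef()
with open('graph.pbtxt', 'r') as f:
    text_format.Merge(f.read(), gdef)

# Import the parsed GraphDef into the current default graph
tf.import_graph_def(gdef, name='')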

3 Answers


You have to load the file contents, parse them into a GraphDef, and then import it. It will be imported into the current default graph. You may want to wrap the import in a graph.as_default(): context manager.

import tensorflow as tf
from tensorflow.core.framework import graph_pb2 as gpb
from google.protobuf import text_format as pbtf

gdef = gpb.GraphDef()

# Read the text-format protobuf from disk
with open('my-graph.pbtxt', 'r') as fh:
    graph_str = fh.read()

# Parse the text into the GraphDef message
pbtf.Parse(graph_str, gdef)

# Import it into the current default graph
tf.import_graph_def(gdef)
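To actually run something from the imported graph (which is what the question asks for), you can import into a fresh graph and look tensors up by name. A minimal sketch, assuming the saved graph contains an op named "output" (a hypothetical name); note that tf.import_graph_def prefixes imported names with "import/" unless you pass name='':

graph = tf.Graph()
with graph.as_default():
    # Import without the default "import/" name prefix
    tf.import_graph_def(gdef, name='')
    y = graph.get_tensor_by_name('output:0')

with tf.Session(graph=graph) as sess:
    print(sess.run(y))

Keep in mind that a GraphDef only stores the graph structure; variable values are not saved this way (see the checkpoint/MetaGraph answers below for that).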
– dm0_
  • ParseFromString takes a binary file per the description [here](https://www.tensorflow.org/extend/tool_developers/), specifically: "The API itself can be a bit confusing - the binary call is actually ParseFromString(), whereas you use a utility function from the text_format module to load textual files." (the binary path is sketched below) – RobR Apr 07 '17 at 17:16
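For completeness, a sketch of the binary path RobR describes, assuming the graph was written in binary form (e.g. with tf.train.write_graph(..., as_text=False)) to a hypothetical graph.pb:

import tensorflow as tf

# Binary .pb file: use the protobuf binary parser, not text_format
gdef = tf.GraphDef()
with open('graph.pb', 'rb') as f:
    gdef.ParseFromString(f.read())

tf.import_graph_def(gdef, name='')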

One option: take a look at TensorFlow's MetaGraph save/restore support, documented here: https://www.tensorflow.org/versions/r0.11/how_tos/meta_graph/index.html
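A minimal sketch of that mechanism using the old 0.x/1.x API that the rest of this thread uses (the file name my_model.meta and the tensor names "input"/"output" are just placeholders):

import tensorflow as tf

# Build a graph and export the whole thing as a MetaGraph
x = tf.placeholder(tf.float64, name="input")
y = tf.add(x, tf.constant(1.0, dtype=tf.float64), name="output")
tf.train.export_meta_graph("my_model.meta")

# Later (e.g. in another script): re-import it and look tensors up by name
tf.reset_default_graph()
tf.train.import_meta_graph("my_model.meta")
g = tf.get_default_graph()

with tf.Session(graph=g) as sess:
    print(sess.run(g.get_tensor_by_name("output:0"),
                   feed_dict={g.get_tensor_by_name("input:0"): 5.0}))

The next answer shows the same idea combined with a tf.train.Saver, so that variable values are restored as well.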

– Peter Hawkins

I solved this problem this way: first, I give the computation I need in my graph the name "output", and then I save the model with the code below...

import tensorflow as tf

x = tf.placeholder(dtype=tf.float64, shape=[], name="input")
a = tf.Variable(111, name="var1", dtype=tf.float64)
b = tf.Variable(-666, name="var2", dtype=tf.float64)

y = tf.add(x, a, name="output")

saver = tf.train.Saver()

with tf.Session() as sess:
    tf.initialize_all_variables().run()

    print(sess.run(y, feed_dict={x: 555}))

    # Writes variable values to "model.ckpt" and the graph to "model.ckpt.meta"
    save_path = saver.save(sess, "model.ckpt", meta_graph_suffix='meta', write_meta_graph=True)
    print("Model saved in file: %s" % save_path)

Second, I need to run a certain operation in the graph, which I know by the name "output". So I just restore the model in another script and run my restored computation, getting the necessary graph parts by their names "input" and "output":

import tensorflow as tf

# Restore graph to another graph (and make it default graph) and variables
graph = tf.Graph()
with graph.as_default():
    saver = tf.train.import_meta_graph("model.ckpt.meta")

    y = graph.get_tensor_by_name("output:0")
    x = graph.get_tensor_by_name("input:0")

    with tf.Session() as sess:

        saver.restore(sess, "model.ckpt")

        print(sess.run(y, feed_dict={x: 888}))

        # Print the restored variable values:
        for var in tf.all_variables():
            print("%s %.2f" % (var.name, var.eval()))
– Sergey