
I have saved a simple LinearRegressor using `export_savedmodel` like so:

```python
feature_spec = create_feature_spec_for_parsing(feature_cols)
input_receiver_fn = build_parsing_serving_input_fn(feature_spec)
dnn.export_savedmodel('my_model/', serving_input_fn=input_receiver_fn)
```

Now I want to load it and use it for predictions. I know I can use `tensorflow_model_server`, but I don't want to use gRPC. I've experimented with loading the graph into a session with `tf.saved_model.loader.load`, but I can't seem to make it work, and it feels hacky at best. How do I go about it?
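For reference, the loader-based approach I've been experimenting with looks roughly like this (the timestamped directory name and the empty `tf.Example` are placeholders; `export_savedmodel` writes to a timestamped subdirectory under `my_model/`, and the exact output key depends on the estimator's head):

```python
import tensorflow as tf
from tensorflow.python.saved_model import tag_constants, signature_constants

# export_savedmodel creates a timestamped subdirectory; point at that,
# not at 'my_model/' itself. The timestamp below is a placeholder.
export_dir = 'my_model/1513701267'

with tf.Session(graph=tf.Graph()) as sess:
    # Load the graph and restore the variables tagged for serving.
    meta_graph_def = tf.saved_model.loader.load(
        sess, [tag_constants.SERVING], export_dir)

    # The default serving signature describes the input/output tensors.
    signature = meta_graph_def.signature_def[
        signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY]
    input_name = signature.inputs['inputs'].name  # parsing receiver input key
    output_key = list(signature.outputs.keys())[0]  # e.g. 'outputs' for regression
    output_name = signature.outputs[output_key].name

    # The parsing serving_input_fn expects serialized tf.Example protos.
    example = tf.train.Example()  # placeholder: populate to match feature_spec
    result = sess.run(output_name,
                      feed_dict={input_name: [example.SerializeToString()]})
```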

I'm using Python 3.6 (one of the reasons I don't want to use gRPC) and TensorFlow 1.2.

EDIT: For now I've copied the proto files and created my own client in Python 3.6, but I'd still like a way to restore a SavedModel without needing TensorFlow Serving, so I'm leaving this question open.

cl0udburst
  • [This question](https://stackoverflow.com/questions/46098863/how-to-import-an-saved-tensorflow-model-train-using-tf-estimator-and-predict-on/46139198#46139198) shows how to do it if you use TF 1.3 or later with the new `tf.contrib.predictor` – stpk Nov 27 '17 at 10:11
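For anyone on TF 1.3 or later, the `tf.contrib.predictor` approach mentioned in that comment looks roughly like this (the export directory path and the empty `tf.Example` are placeholders):

```python
import tensorflow as tf
from tensorflow.contrib import predictor

# Point at the timestamped export subdirectory (placeholder path).
predict_fn = predictor.from_saved_model('my_model/1513701267')

# With build_parsing_serving_input_fn, the input key is 'inputs' and it
# takes serialized tf.Example protos.
example = tf.train.Example()  # placeholder: populate to match feature_spec
predictions = predict_fn({'inputs': [example.SerializeToString()]})
print(predictions)  # dict of output arrays keyed by the signature's output names
```

This wraps the same `tf.saved_model.loader.load` machinery in a callable, so there is no server or gRPC involved.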
