
I am trying to use TF Serving to deploy my PyTorch model. I have already exported the model to ONNX. How can I generate the pb model for TF Serving?
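For reference, this is roughly how I exported the model (a minimal sketch with a placeholder model and input shape standing in for my actual network):

import torch

# Placeholder model and dummy input, standing in for the real network.
model = torch.nn.Linear(10, 2)
model.eval()
dummy_input = torch.randn(1, 10)

# torch.onnx.export traces the model with the dummy input and writes the ONNX file.
torch.onnx.export(
    model,
    dummy_input,
    "model.onnx",
    input_names=["input"],
    output_names=["output"],
    opset_version=11,
)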

coin cheung
    The question at https://stackoverflow.com/questions/53182177/how-do-you-convert-a-onnx-to-tflite/58576060#58576060 also has answer to your question. – Ahwar May 15 '20 at 12:09

1 Answer


Use the onnx/onnx-tensorflow converter tool as a TensorFlow backend for ONNX.

  1. Install onnx-tensorflow: `pip install onnx-tf`

  2. Convert using the command line tool: `onnx-tf convert -t tf -i /path/to/input.onnx -o /path/to/output.pb`

Alternatively, you can convert through the Python API.

import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("input_path")  # load the ONNX model
tf_rep = prepare(onnx_model)  # prepare the TensorFlow representation
tf_rep.export_graph("output_path")  # export the model
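
Note that with the TensorFlow 2 versions of onnx-tf, export_graph writes a SavedModel directory rather than a single frozen .pb file, which is the format TF Serving loads anyway. As a quick sanity check you can load the export back and inspect its serving signature (a sketch, assuming the export carries the default serving_default signature; "output_path" is the directory from above):

import tensorflow as tf

# Load the exported SavedModel and look up its serving signature.
loaded = tf.saved_model.load("output_path")
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)  # expected input names and shapes
print(infer.structured_outputs)  # output tensors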
vini_s
    With the update to TensorFlow 2, `onnx-tf` changed: `onnx-tf convert -i /path/to/input.onnx -o /path/to/output.pb` is now enough. You may want to install `onnx-tf` directly from master if you run into issues. – M.Winkens Sep 07 '20 at 07:48
    Sorry, doing this I get this error: OverflowError: Python int too large to convert to C long. Can you help me too? – Azazel Jan 03 '21 at 02:27