
I trained a ResNet model in Torch. Then I converted it to Caffe and to TFLite. Now I want to convert it to ONNX. How can I do it? I tried this command:

python3 -m tf2onnx.convert --tflite resnet.lite --output resnet.lite.onnx --opset 13 --verbose

because the current format of the model is TFLite,

and got this error:

return packer_type.unpack_from(memoryview_type(buf), head)[0]
struct.error: unpack_from requires a buffer of at least 11202612 bytes for unpacking 4 bytes at offset 11202608 (actual buffer size is 2408448)

Thanks.

Shirly
  • You can directly convert the model from PyTorch to ONNX. PyTorch supports this via `torch.onnx.export` [link](https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html); a sketch follows after these comments. – Hiren Namera Jun 08 '22 at 06:53
  • Thanks @HirenNamera. Can I convert the model like this when it is in TF format? – Shirly Jun 09 '22 at 07:50
  • But you also have the PyTorch model with you, right? The one you trained using PyTorch; you can use that model to convert to ONNX. – Hiren Namera Jun 09 '22 at 08:24
  • No, I have a Torch model. Is that possible? – Shirly Jun 12 '22 at 05:30
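A minimal sketch of the PyTorch-to-ONNX route suggested in the comments above, assuming the trained weights can still be loaded in PyTorch; the architecture, weight path, input shape, and output filename below are placeholders:

import torch
import torchvision

# Placeholder: rebuild the same ResNet architecture and load the trained weights.
model = torchvision.models.resnet18()
model.load_state_dict(torch.load("resnet.pth", map_location="cpu"))
model.eval()

# Dummy input with the shape the network expects (placeholder shape).
dummy_input = torch.randn(1, 3, 224, 224)

# Export straight to ONNX, as suggested above.
torch.onnx.export(model, dummy_input, "resnet.onnx", opset_version=13,
                  input_names=["input"], output_names=["output"])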

2 Answers

0

You can try something like this, check out the link; maybe you need to freeze the model layers before starting the conversion.

pip install onnxruntime
pip install git+https://github.com/onnx/tensorflow-onnx
python -m tf2onnx.convert --saved-model ./checkpoints/yolov4.tf --output model.onnx --opset 11 --verbose
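Note that --saved-model expects a TensorFlow SavedModel directory. If you only have a Keras model in memory, here is a minimal sketch for exporting one first (the ResNet50 constructor and paths below are placeholders; "freezing" here just marks the layers non-trainable):

import tensorflow as tf

# Placeholder: load or rebuild your Keras model here.
model = tf.keras.applications.ResNet50(weights=None)

# Mark the layers as non-trainable before export, if required.
for layer in model.layers:
    layer.trainable = False

# Write a SavedModel directory that tf2onnx's --saved-model flag can read.
model.save("./checkpoints/resnet_savedmodel")
# Then: python -m tf2onnx.convert --saved-model ./checkpoints/resnet_savedmodel --output model.onnx --opset 11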

You can also try this one (link):

pip install tf2onnx 
import tensorflow as tf
import tf2onnx
import onnx

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(4, activation="relu"))

input_signature = [tf.TensorSpec([3, 3], tf.float32, name='x')]
# Use from_function for tf functions
onnx_model, _ = tf2onnx.convert.from_keras(model, input_signature, opset=13)
onnx.save(onnx_model, "dst/path/model.onnx")
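Whichever route produces the .onnx file, you can sanity-check it with onnxruntime (installed above); a small sketch, assuming the exported file from the snippet above and its [3, 3] float32 input named 'x':

import numpy as np
import onnxruntime as ort

# Load the exported model and run one inference to confirm it is valid.
session = ort.InferenceSession("dst/path/model.onnx")
input_name = session.get_inputs()[0].name  # "x" in the example above
dummy = np.random.rand(3, 3).astype(np.float32)
outputs = session.run(None, {input_name: dummy})
print(outputs[0].shape)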
Hiren Namera
  • Thanks @Hiren Namera, but I am not asking about converting a TF model, I already know that; I am asking about a slightly more complex case. If you can answer what I asked, I would be happy. Good day :) – Shirly Jun 12 '22 at 04:57
  • Why are you converting TFLite to ONNX? You can convert TF or Torch to ONNX with int32 or int8; that will be the same as the ONNX conversion. – Hiren Namera Jun 12 '22 at 06:01
  • OK, I still have the same question: how do you convert a Torch model, or a Torch model that I converted to TF, into an ONNX model? – Shirly Jun 12 '22 at 06:13
  • Check this answer [link](https://stackoverflow.com/a/72557030/12635565) – Hiren Namera Jun 12 '22 at 06:17
  • You can check [link](https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html) and [link2](https://pytorch.org/tutorials/advanced/super_resolution_with_onnxruntime.html) – Hiren Namera Jun 12 '22 at 06:18
  • You gave me PyTorch. I need Torch. It's not the same. – Shirly Jun 12 '22 at 06:57
  • Both are the same; send me the model and I will send a script for the conversion. – Hiren Namera Jun 12 '22 at 08:21
  • Hi @Hiren Namera, I found the bug; it was a problem with the model file, which had not been copied correctly :( :(. After I fixed that, the error I mentioned in my question disappeared. Thank you very much. – Shirly Jun 12 '22 at 09:19
0

You should check your model file; maybe you have a corrupted file, and the error is because of that. Try copying / downloading the files again and then retry the tf2onnx conversion:

python3 -m tf2onnx.convert --tflite resnet.lite --output resnet.lite.onnx --opset 13 --verbose
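One way to confirm the .tflite file itself is intact before re-running the converter is to load it with TensorFlow's own interpreter; a minimal sketch, assuming TensorFlow is installed:

import tensorflow as tf

# A truncated or badly copied file fails here with a similar buffer-size error,
# which points at the file itself rather than at tf2onnx.
interpreter = tf.lite.Interpreter(model_path="resnet.lite")
interpreter.allocate_tensors()
print(interpreter.get_input_details())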
Shirly