Questions tagged [onnx]

ONNX is an open format to represent deep learning models and enable interoperability between different frameworks.

ONNX

The Open Neural Network Exchange (ONNX) is an open-source artificial intelligence ecosystem. With ONNX, AI developers can more easily move models between state-of-the-art tools and choose the combination that is best for them. ONNX is widely supported and can be found in many frameworks, tools, and hardware. It is developed and supported by a community of partners.


809 questions
21 votes · 4 answers

How do you convert a .onnx to tflite?

I've exported my model to ONNX via: # Export the model torch_out = torch.onnx._export(learn.model, # model being run x, # model input (or a tuple for multiple inputs) …
Suhail Doshi
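One common route, sketched below, goes ONNX -> TensorFlow SavedModel (via the onnx-tf package) -> TFLite. Package availability and file names are assumptions, and older onnx-tf releases write a single .pb instead of a SavedModel directory.

```python
# Sketch: ONNX -> TensorFlow SavedModel (onnx-tf) -> TFLite flatbuffer.
import onnx
import tensorflow as tf
from onnx_tf.backend import prepare

onnx_model = onnx.load("model.onnx")       # the file exported from PyTorch
tf_rep = prepare(onnx_model)               # TensorFlow representation of the graph
tf_rep.export_graph("saved_model")         # writes a SavedModel directory (onnx-tf >= 1.7)

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model")
with open("model.tflite", "wb") as f:
    f.write(converter.convert())           # serialize the TFLite flatbuffer
```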
12 votes · 3 answers

How do you run an ONNX model on a GPU?

I'm trying to run an ONNX model: import onnxruntime as ort import onnxruntime.backend model_path = "model.onnx" #https://microsoft.github.io/onnxruntime/ ort_sess = ort.InferenceSession(model_path) print( ort.get_device() ) This prints…
djacobs7
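A minimal sketch, assuming the onnxruntime-gpu build is installed: passing an explicit provider list makes the session prefer CUDA and fall back to CPU, and get_providers() shows what was actually loaded.

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession(
    "model.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)
print(sess.get_providers())                # confirms whether CUDA was picked up

inp = sess.get_inputs()[0]
x = np.random.rand(1, 3, 224, 224).astype(np.float32)   # dummy input; shape is an assumption
outputs = sess.run(None, {inp.name: x})
```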
12 votes · 1 answer

How could I convert an ONNX model to a TensorFlow SavedModel?

I am trying to use tf-serving to deploy my torch model. I have exported my torch model to ONNX. How could I generate the pb model for tf-serving?
coin cheung
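A minimal sketch with the onnx-tf backend, assuming onnx-tf >= 1.7 (where export_graph writes a SavedModel directory that tf-serving can load from a versioned path):

```python
import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("model.onnx")
prepare(onnx_model).export_graph("export/1")   # "export/1": model version 1 for tf-serving
```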
12 votes · 3 answers

Find input shape from onnx file

How can I find the input size of an onnx model? I would eventually like to script it from python. With tensorflow I can recover the graph definition, find input candidate nodes from it and then obtain their size. Can I do something similar with ONNX…
Nick Skywalker
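The declared input shapes can be read straight from the graph proto with the onnx Python package; a sketch (initializers are filtered out because some exporters list weights under graph.input):

```python
import onnx

model = onnx.load("model.onnx")
init_names = {init.name for init in model.graph.initializer}
for inp in model.graph.input:
    if inp.name in init_names:
        continue                             # skip weights, keep real inputs
    dims = [
        d.dim_value if d.dim_value > 0 else (d.dim_param or "?")
        for d in inp.type.tensor_type.shape.dim
    ]                                        # fixed sizes or symbolic/dynamic axes
    print(inp.name, dims)
```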
9 votes · 1 answer

PyTorch model to C++

I have trained a detection algorithm and saved my best model. Now I want to convert my model (pretrained) to C++ and use it in my app. What are the possible ways to convert a PyTorch model to C++? Thanks!
Y0shimitsu
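One common route is TorchScript: trace or script the model in Python, save it, and load the file from C++ with libtorch's torch::jit::load. A sketch with a placeholder model and input shape:

```python
import torch
import torch.nn as nn

model = nn.Sequential(                     # stand-in for the trained detector
    nn.Conv2d(3, 8, 3), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 2),
).eval()

example = torch.rand(1, 3, 224, 224)       # example input used for tracing
traced = torch.jit.trace(model, example)   # or torch.jit.script(model) for data-dependent control flow
traced.save("model_traced.pt")             # in C++: torch::jit::load("model_traced.pt")
```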
9 votes · 1 answer

Can't we run an ONNX model imported into PyTorch?

I have been trying to import a model from ONNX format to work with PyTorch, but I am finding it difficult to get an example, as most of the resources on the Internet talk about exporting a PyTorch model to ONNX. I found that torch.onnx() can…
Ajai
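torch.onnx only goes in the PyTorch-to-ONNX direction; there is no built-in importer the other way. A common workaround, sketched here, is to run the .onnx file directly with onnxruntime (third-party converters such as onnx2pytorch exist but are not shown):

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
x = np.random.rand(1, 3, 224, 224).astype(np.float32)   # shape is an assumption
outputs = sess.run(None, {sess.get_inputs()[0].name: x})
print([o.shape for o in outputs])
```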
8 votes · 4 answers

Why is a CNN running in Python extremely slow in comparison to Matlab?

I have trained a CNN in Matlab 2019b that classifies images between three classes. When this CNN was tested in Matlab it was functioning fine and only took 10-15 seconds to classify an image. I used the exportONNXNetwork function in Matlab so that I…
user12909684
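A frequent cause of this kind of slowdown is rebuilding the inference engine for every image; a hedged sketch that creates one onnxruntime session up front and reuses it per prediction:

```python
import time
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("cnn.onnx", providers=["CPUExecutionProvider"])  # build once
input_name = sess.get_inputs()[0].name

def classify(img):                          # img: preprocessed float32 batch
    return sess.run(None, {input_name: img})[0]

x = np.random.rand(1, 3, 224, 224).astype(np.float32)    # placeholder image
start = time.time()
classify(x)
print(f"per-image inference: {time.time() - start:.3f}s")
```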
8 votes · 2 answers

Espresso ANERuntimeEngine Program Inference overflow

I have two CoreML models. One works fine, and the other generates this error message: [espresso] [Espresso::ANERuntimeEngine::__forward_segment 0] evaluate[RealTime]WithModel returned 0; code=5 err=Error Domain=com.apple.appleneuralengine Code=5…
Stephen Furlani
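A workaround that is sometimes suggested (an assumption, not a confirmed fix for this error) is to keep the misbehaving model off the Apple Neural Engine. With coremltools 5+ that can be done by restricting compute units when loading the model; the equivalent on-device knob is MLModelConfiguration.computeUnits.

```python
import coremltools as ct

# Placeholder path; CPU_AND_GPU keeps the model off the ANE segment that returns code=5.
model = ct.models.MLModel("Model.mlmodel", compute_units=ct.ComputeUnit.CPU_AND_GPU)
```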
7 votes · 2 answers

Unexpected input data type. Actual: (tensor(double)) , expected: (tensor(float))

I am learning this new ONNX framework that allows us to deploy deep learning (and other) models into production. However, there is one thing I am missing. I thought that the main reason for having such a framework is so that for inference…
Petr
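The message means the feed contains float64 values (NumPy's default) while the exported graph declares float32 inputs; casting before run() resolves it. A minimal sketch:

```python
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
x = np.random.rand(1, 10)                  # float64 by default -> tensor(double)
x = x.astype(np.float32)                   # match the model's declared tensor(float)
out = sess.run(None, {sess.get_inputs()[0].name: x})
```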
7 votes · 3 answers

What is the fastest Mask R-CNN implementation available

I'm running a Mask R-CNN model on an edge device (with an NVIDIA GTX 1080). I am currently using the Detectron2 Mask R-CNN implementation and I achieve an inference speed of around 5 FPS. To speed this up I looked at other inference engines and…
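One option, sketched under the assumption that the model can be exported to ONNX and that onnxruntime was built with TensorRT support: ask the session to try the TensorRT execution provider first and fall back to CUDA/CPU.

```python
import onnxruntime as ort

sess = ort.InferenceSession(
    "maskrcnn.onnx",                        # placeholder ONNX export of the detector
    providers=[
        "TensorrtExecutionProvider",        # fastest path when available
        "CUDAExecutionProvider",
        "CPUExecutionProvider",
    ],
)
print(sess.get_providers())                 # shows which providers actually loaded
```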
7 votes · 1 answer

How to set environment variable TF_Keras = 1 for onnx conversion?

Recently updated to tensorflow 2.0 and am having trouble getting my .h5 models into .onnx. It used to be a very simple procedure, but now I am having an issue. When I run the following code: # onnx testing import onnx import keras2onnx import…
jblonigan
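keras2onnx reads the TF_KERAS variable when it is imported, so the variable has to be set in the environment before the import; a sketch:

```python
import os
os.environ["TF_KERAS"] = "1"               # must be set before importing keras2onnx

import tensorflow as tf
import keras2onnx

model = tf.keras.models.load_model("model.h5")
onnx_model = keras2onnx.convert_keras(model, model.name)
keras2onnx.save_model(onnx_model, "model.onnx")
```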
6 votes · 2 answers

How to run a PyTorch model in the browser?

I would like to know what options I have to run an object detection model in the browser. Until now I have found the following options: Streamlit: it is very simple but requires the server's resources to run, including its own camera. Besides, I…
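One more option, hedged: export the detector to ONNX in Python and run the resulting file client-side with onnxruntime-web, so inference happens in the visitor's browser rather than on the server. Only the export step is sketched here; the model choice and input size are assumptions, and the exact torchvision API varies by version.

```python
import torch
import torchvision

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True).eval()
dummy = [torch.rand(3, 480, 640)]           # detection models take a list of image tensors
torch.onnx.export(model, dummy, "detector.onnx", opset_version=11)
```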
6 votes · 2 answers

How to convert HuggingFace's Seq2seq models to ONNX format

I am trying to convert the Pegasus newsroom model from HuggingFace's transformers to the ONNX format. I followed this guide published by Huggingface. After installing the prereqs, I ran this code: !rm -rf onnx/ from pathlib import Path from…
SlimeCity
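One route that handles the encoder/decoder split needed for seq2seq export is the optimum library; a hedged sketch (the keyword that triggers the conversion has changed across optimum versions, e.g. export=True vs. from_transformers=True):

```python
from optimum.onnxruntime import ORTModelForSeq2SeqLM
from transformers import AutoTokenizer

model_id = "google/pegasus-newsroom"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = ORTModelForSeq2SeqLM.from_pretrained(model_id, export=True)  # convert to ONNX on load
model.save_pretrained("onnx/")                                       # writes encoder/decoder .onnx files

ids = tokenizer("Some article text ...", return_tensors="pt")
print(tokenizer.decode(model.generate(**ids)[0], skip_special_tokens=True))
```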
6 votes · 3 answers

Darknet model to onnx

I am currently working with Darknet on YOLOv4, with 1 class. I need to export those weights to ONNX format for TensorRT inference. I've tried multiple techniques, using ultralytics to convert or going from TensorFlow to ONNX, but none seems to work.…
remc
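The usual route, sketched under heavy assumptions: load the Darknet cfg/weights into a PyTorch re-implementation of YOLOv4 and export that with torch.onnx. Darknet and load_darknet_weights below are hypothetical stand-ins for whichever YOLOv4 PyTorch port is used, not a real, specific API.

```python
import torch

model = Darknet("yolov4-custom.cfg")                  # hypothetical PyTorch port of the network
load_darknet_weights(model, "yolov4-custom.weights")  # hypothetical weight loader
model.eval()

dummy = torch.zeros(1, 3, 608, 608)                   # input size taken from the cfg
torch.onnx.export(
    model, dummy, "yolov4.onnx",
    opset_version=11,                                 # an opset TensorRT's ONNX parser accepts
    input_names=["input"], output_names=["boxes", "confs"],
)
```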
6 votes · 5 answers

Can't convert Pytorch to ONNX

Trying to convert this PyTorch model to ONNX gives me this error. I've searched GitHub and this error came up before in version 1.1.0 but was apparently rectified. Now I'm on torch 1.4.0 (Python 3.6.9) and I still see this error. File…
user10007342
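Export failures like this are often version- and opset-specific; pinning opset_version explicitly (and upgrading torch) tends to either fix them or produce a clearer unsupported-operator message. A minimal sketch with a placeholder model:

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 2)).eval()
dummy = torch.rand(1, 16)
torch.onnx.export(
    model, dummy, "model.onnx",
    opset_version=11,                      # pick an opset the installed torch supports
    input_names=["input"], output_names=["output"],
    dynamic_axes={"input": {0: "batch"}},  # optional: variable batch size
)
```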