
I have trained a detection algorithm and saved my best model. Now I want to convert my (pretrained) model to C++ and use it in my app. What are the possible ways to convert a PyTorch model to C++?

Thanks!

Y0shimitsu

1 Answer

  • You can use TorchScript, an intermediate representation of a PyTorch model obtained through tracing or scripting, which can then be run in a C++ environment. For this, you'll probably have to modify the model itself so that it can be traced or scripted (see the libtorch loading sketch below).

  • You can use ONNX (Open Neural Network Exchange): export your model to the ONNX format and load it in another framework with a C++ runtime, such as Caffe2. This route comes with its own implications, though (see the ONNX Runtime sketch below).

  • The easiest option is Embedding Python, through which you run your Python (PyTorch) model from a C++ host application. Note that the model will still execute in Python, only driven from C++, so there won't be any of the speed gains you might expect from a native C++ deployment (see the embedding sketch below).
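For the TorchScript route, a minimal loading sketch with libtorch might look like the following. The file name `traced_model.pt` and the 1x3x224x224 input shape are assumptions for illustration; the module is assumed to have been exported in Python with `torch.jit.trace` or `torch.jit.script`.

```cpp
#include <torch/script.h>
#include <iostream>
#include <vector>

int main() {
  // Load the module exported from Python (file name is an assumption).
  torch::jit::script::Module module = torch::jit::load("traced_model.pt");
  module.eval();

  // Dummy input; the shape depends on your detection model.
  std::vector<torch::jit::IValue> inputs;
  inputs.push_back(torch::ones({1, 3, 224, 224}));

  // Run a forward pass and print the output shape.
  torch::Tensor output = module.forward(inputs).toTensor();
  std::cout << output.sizes() << std::endl;
  return 0;
}
```

You'd build this against libtorch, e.g. with CMake and `find_package(Torch REQUIRED)`.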
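For the ONNX route, one common C++ consumer (besides Caffe2) is ONNX Runtime. A hedged sketch, assuming the model was exported as `model.onnx` with input/output tensor names `input` and `output`:

```cpp
#include <onnxruntime_cxx_api.h>
#include <vector>

int main() {
  Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "detector");
  Ort::SessionOptions options;
  Ort::Session session(env, "model.onnx", options);  // file name is an assumption

  // Dummy CPU input tensor; shape and element count are assumptions.
  std::vector<float> data(1 * 3 * 224 * 224, 0.0f);
  std::vector<int64_t> shape{1, 3, 224, 224};
  Ort::MemoryInfo mem = Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
  Ort::Value input = Ort::Value::CreateTensor<float>(
      mem, data.data(), data.size(), shape.data(), shape.size());

  // Tensor names must match those used at export time (assumed here).
  const char* input_names[] = {"input"};
  const char* output_names[] = {"output"};
  auto outputs = session.Run(Ort::RunOptions{nullptr},
                             input_names, &input, 1,
                             output_names, 1);
  return 0;
}
```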
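For the embedding option, here is a minimal CPython-embedding sketch; the `predict` module and its `run` function are hypothetical names standing in for your own PyTorch inference script:

```cpp
#include <Python.h>

int main() {
  Py_Initialize();

  // Import a hypothetical predict.py that loads the PyTorch model
  // and exposes a run() function.
  PyObject* module = PyImport_ImportModule("predict");
  if (module != nullptr) {
    PyObject* result = PyObject_CallMethod(module, "run", nullptr);
    Py_XDECREF(result);
    Py_DECREF(module);
  }

  Py_Finalize();
  return 0;
}
```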

Also, with the release of torchvision 0.5, all models in torchvision have native support for TorchScript and ONNX.

kHarshit
  • Thanks for the information! Do you know if there is any way to use CUDA with C++ after converting the PyTorch model to C++? – Y0shimitsu Jan 19 '20 at 22:59
  • 2
  • Yes, TorchScript supports CUDA. You can simply use `model->to(at::kCUDA)` and `input.to(at::kCUDA)`. – kHarshit Jan 20 '20 at 05:13
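To illustrate the comment above, a hedged libtorch sketch; the file name and input shape are assumptions, and a CUDA-enabled libtorch build is required:

```cpp
#include <torch/script.h>

int main() {
  torch::jit::script::Module module = torch::jit::load("traced_model.pt");

  if (torch::cuda::is_available()) {
    module.to(at::kCUDA);  // move the model to the GPU
    torch::Tensor input = torch::ones({1, 3, 224, 224}).to(at::kCUDA);
    torch::Tensor output = module.forward({input}).toTensor();
    output = output.to(at::kCPU);  // bring the result back for host-side use
  }
  return 0;
}
```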