
I am using TensorRT to convert a model from ONNX to TRT format. The model originally comes from the TensorFlow Model Zoo (SSD ResNet50). When I try to convert it, I get the error:

```
[E] [TRT] /home/jenkins/agent/workspace/OSS/OSS_L0_MergeRequest/oss/parsers/onnx/ModelImporter.cpp:708: ERROR: /home/jenkins/agent/workspace/OSS/OSS_L0_MergeRequest/oss/parsers/onnx/builtin_op_importers.cpp:4298 In function importFallbackPluginImporter: [8] Assertion failed: creator && "Plugin not found, are the plugin name, version, and namespace correct?"
[E] Engine set up failed &&&& FAILED TensorRT.trtexec # trtexec --onnx=../model.onnx --fp16=enable --workspace=5500 --batch=1 --saveEngine=model_op11.trt --verbose
```

As far as I can tell, it is looking for a plugin for the NonMaxSuppression operation. Does anyone know how to convert a model from the TensorFlow Model Zoo to TensorRT?
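One way to confirm that the error is a missing plugin (rather than a naming mismatch) is to list the plugin creators that the installed TensorRT build actually registers. Below is a minimal sketch using the TensorRT Python API; it assumes the `tensorrt` Python bindings are installed, and `nms_plugin_available` is just an illustrative helper name.

```python
# Sketch: check whether the local TensorRT install registers any
# NMS-style plugin creator. Requires the tensorrt Python bindings;
# falls back gracefully when they are absent.
try:
    import tensorrt as trt
except ImportError:  # TensorRT Python bindings not installed
    trt = None

def nms_plugin_available():
    """Return True if an NMS-style plugin creator is registered, else False."""
    if trt is None:
        return False
    # Register the built-in plugins shipped with libnvinfer_plugin
    trt.init_libnvinfer_plugins(trt.Logger(trt.Logger.WARNING), "")
    creators = trt.get_plugin_registry().plugin_creator_list
    return any("NMS" in creator.name for creator in creators)

print(nms_plugin_available())
```

If this prints `False` on your machine, the ONNX parser's fallback plugin importer has nothing to map `NonMaxSuppression` onto, which matches the assertion in the log above.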

  • `getPluginCreator could not find plugin` comes through the fallback path of the ONNX-TensorRT importer. What this means is that the default library doesn't support the NonMaxSuppression op. So until they update TensorRT to handle NonMaxSuppression layers, there is not a lot you can do. – Atharva Gundawar Jun 25 '21 at 12:06
  • @AtharvaGundawar Custom plugins are supported, so it is possible to write it by yourself. But the problem is that I don't see any way to do this in Python. – Araw Jun 28 '21 at 17:18
  • If you haven't already, check this out: https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#example1_add_custom_layer_python – Atharva Gundawar Jun 29 '21 at 02:39
  • Use `tf.experimental.tensorrt.Converter`, an offline converter for TensorFlow-TensorRT transformation of TF 2.0 SavedModels. For more details, take a look at the TensorFlow doc [here](https://www.tensorflow.org/api_docs/python/tf/experimental/tensorrt/Converter). Thanks! –  Jun 30 '21 at 02:28

1 Answer


Got this fixed by upgrading to TensorRT 8.

Araw