6

I am currently working with Darknet on YOLOv4, with 1 class.

I need to export those weights to ONNX format for TensorRT inference. I've tried multiple techniques, such as converting with ultralytics or going from TensorFlow to ONNX, but none of them seems to work. Is there a direct way to do it?

remc

3 Answers

5

Check this GitHub repo: https://github.com/Tianxiaomo/pytorch-YOLOv4

Running the demo_darknet2onnx.py script, you'll be able to generate the ONNX model from the Darknet .cfg and .weights files.

Usage example:

python demo_darknet2onnx.py <cfgFile> <weightFile> <imageFile> <batchSize>

You can also choose the batch size used for inference calls on the converted model.
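To confirm the export worked before moving on to TensorRT, you can load the resulting file with onnxruntime and run a dummy tensor through it. This is just a sanity-check sketch: the file name and the 416x416 input size below are assumptions, so substitute whatever demo_darknet2onnx.py actually produced for your .cfg.

# Sanity check of the exported ONNX model (file name / input size assumed).
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("yolov4_1_3_416_416_static.onnx",
                            providers=["CPUExecutionProvider"])
inp = sess.get_inputs()[0]
print("input:", inp.name, inp.shape)

# Dummy NCHW float32 input just to confirm the graph runs end to end.
dummy = np.random.rand(1, 3, 416, 416).astype(np.float32)
outputs = sess.run(None, {inp.name: dummy})
for meta, out in zip(sess.get_outputs(), outputs):
    print(meta.name, out.shape)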

Renan Vilas Novas
2

The following repo exports YOLOv3 models from Darknet to ONNX for TensorRT inference. You can use it as a reference for your YOLOv4 model.

https://github.com/jkjung-avt/tensorrt_demos/tree/master/yolo
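That repo does the ONNX-to-TensorRT step with its own script, so treat it as the reference. Purely as an illustration of what that step involves, here is a hedged sketch using the TensorRT Python API; it follows the TensorRT 7.x / early 8.x names (max_workspace_size and build_engine were renamed in later releases), and the file names are placeholders.

# Illustrative only: build a TensorRT engine from an exported ONNX file.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def build_engine(onnx_path, fp16=True):
    builder = trt.Builder(TRT_LOGGER)
    # The ONNX parser requires an explicit-batch network.
    flags = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    network = builder.create_network(flags)
    parser = trt.OnnxParser(network, TRT_LOGGER)

    with open(onnx_path, "rb") as f:
        if not parser.parse(f.read()):
            for i in range(parser.num_errors):
                print(parser.get_error(i))
            raise RuntimeError("failed to parse the ONNX model")

    config = builder.create_builder_config()
    config.max_workspace_size = 1 << 30              # 1 GiB of scratch space
    if fp16 and builder.platform_has_fast_fp16:
        config.set_flag(trt.BuilderFlag.FP16)

    return builder.build_engine(network, config)

engine = build_engine("yolov4.onnx")
with open("yolov4.trt", "wb") as f:
    f.write(engine.serialize())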

Asmita Khaneja
0

You can convert Scaled-YOLO models (yolov4, yolov4-csp, yolov4x-mish, yolov4-P5, etc.) to ONNX, and it works perfectly fine.

https://github.com/linghu8812/tensorrt_inference
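Whichever converter you end up using, it is worth running a quick structural check on the resulting ONNX file before handing it to TensorRT. A minimal sketch with the onnx package is below; the file name is a placeholder.

# Structural check of a converted model (file name is a placeholder).
import onnx

model = onnx.load("yolov4-csp.onnx")
onnx.checker.check_model(model)        # raises if the graph is malformed

# Print graph inputs/outputs to confirm names and shapes look sane.
for t in model.graph.input:
    print("input :", t.name)
for t in model.graph.output:
    print("output:", t.name)
print("opset:", model.opset_import[0].version)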

Akash Desai