Cannot compile any darknet YOLO models - even Hailo's own license plate detector

I am trying to retrain a custom darknet YOLO model, and have been unable to convert multiple models completely to a .hef file. The reason I, and many others, are trying to use darknet models such as YOLOv4 is that they are not licensed by Ultralytics (unlike YOLOv8) and instead carry a much more permissive open-source MIT license.

The way I have been converting my models is by following these instructions: first training a darknet model, then converting from darknet -> ONNX using the Model Zoo dockers. I have glossed over some steps I had to take to edit my darknet .cfg files to fit the network sizes I want for retraining.

Then, I start the Hailo AI Software Suite docker, copy the ONNX model into the SW Suite docker shared space, and run the Hailo tutorial DFC_1_Parsing_Tutorial.ipynb Jupyter notebook to convert my newly created ONNX model from onnx->har->hef. I have been unable to fully convert multiple models from input to output, because I get this error:

 UnsupportedShuffleLayerError in op Reshape_756: Failed to determine type of layer to create in node Reshape_756

Going through this thread and this thread, it’s clear that certain reshape nodes used as post-processing steps in darknet YOLO models are not supported by Hailo.
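
For reference, the parsing call I have been using in the tutorial notebook looks roughly like the sketch below; the ONNX file name and the two end node names are placeholders for whatever the last convolution nodes before the reshapes are called in your own graph.

from hailo_sdk_client import ClientRunner

# Minimal sketch of the DFC parsing step, cutting the graph before the
# unsupported Reshape post-processing nodes. Names are placeholders.
runner = ClientRunner(hw_arch="hailo8")
hn, npz = runner.translate_onnx_model(
    "yolov4_tiny_custom.onnx",                # placeholder file name
    "yolov4_tiny_custom",
    end_node_names=["Conv_341", "Conv_349"],  # the last convs before the reshapes
)
runner.save_har("yolov4_tiny_custom.har")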

Models I have tried that failed:

  1. Converted my own pretrained, custom model from darknet->onnx->har->hef. I had to exclude parts of my model after the reshape nodes
  2. Retrained yolov4-tiny-3l from inside the model zoo docker, converted from darknet->onnx->har->hef, and got the same reshape error that requires excluding post-processing nodes
  3. As a sanity check, I took Hailo’s own license_plate_detector YOLO model from here, but I did NOT retrain. Instead I took this supposedly working model from darknet->onnx->har->hef, but I still got the same reshape errors.

I am very confused about how the Hailo team has produced the license_plate_detector YOLO model that works on a Hailo8, because I have not been able to produce a working .hef file.

Any help on retraining non-Ultralytics models for custom classes would be greatly appreciated; as it stands, I am unable to use Hailo for my project, but I am excited about the possibility of using it.


You are right, for some models the post-processing has to be excluded when compiling a model for the Hailo device. It means that only the model itself will be offloaded to the Hailo accelerator, while the post processing will run on the host CPU.

In some cases, the post processing implementation for the CPU is included in HailoRT and done under the hood. In other cases, you have to implement the post processing as part of your application.
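
As a rough illustration only (this is a generic sketch, not code taken from HailoRT; the anchor sizes, class count and input size are assumptions that have to come from your own .cfg), decoding one raw YOLO output branch on the host looks something like this:

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generic sketch of host-side decoding for ONE yolov4-tiny output branch.
# raw has shape (grid, grid, num_anchors * (5 + num_classes)); anchors,
# num_classes and img_size are assumptions taken from a typical .cfg.
def decode_branch(raw, anchors, img_size=416, num_classes=80, conf_thres=0.25):
    grid = raw.shape[0]
    raw = raw.reshape(grid, grid, len(anchors), 5 + num_classes)
    boxes = []
    for gy in range(grid):
        for gx in range(grid):
            for i, (aw, ah) in enumerate(anchors):
                tx, ty, tw, th, obj = raw[gy, gx, i, :5]
                cls_scores = sigmoid(raw[gy, gx, i, 5:])
                conf = sigmoid(obj) * cls_scores.max()
                if conf < conf_thres:
                    continue
                cx = (gx + sigmoid(tx)) / grid * img_size
                cy = (gy + sigmoid(ty)) / grid * img_size
                w, h = aw * np.exp(tw), ah * np.exp(th)
                boxes.append([cx - w / 2, cy - h / 2, w, h, conf, int(cls_scores.argmax())])
    return boxes  # apply NMS across all branches afterwards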

If you look for models with a permissive license, I can propose the following model families from the Hailo Model Zoo:

  • Yolox - has a permissive license, and its post processing is included in HailoRT (an example Model Zoo compile command is sketched after this list).
  • Damoyolo - has a permissive license and a good accuracy/throughput trade-off, but its post processing is not yet included in HailoRT.
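
For example, compiling Yolox from the Model Zoo is roughly a single command (a sketch only; the exact model name, flags and calibration set depend on your Model Zoo version):

hailomz compile yolox_s_leaky --hw-arch hailo8 --calib-path /path/to/calibration/images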

I’m also trying to do the same thing using the darknet YOLOv4-tiny model. What needs to happen to support this model?

@Dhiren_Wijesinghe and @Gtech
If you share the onnx file, we can try to help.

Many thanks @shashi

I’ve created a github repo containing the Yolov4-tiny base model from HankAI using the default weights and converted the model into onnx format.

I’ve used the GitHub - james77777778/darknet-onnx: ONNX deployment of darknet (YOLOv3/YOLOv4) repo to convert the model

Weights: Where to find the MSCOCO weights? · Issue #21 · hank-ai/darknet · GitHub

Config: darknet/cfg/yolov4-tiny.cfg at master · hank-ai/darknet · GitHub

Conversion command:

python3 main.py --cfg cfg/yolov4-tiny.cfg --weight weights/yolov4-tiny.weights --img data/dog.jpg --names data/coco.names

ONNX File: yolov4-darknet-hailo/model.onnx at main · garbit/yolov4-darknet-hailo · GitHub
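
In case it is useful, I sanity-check the exported ONNX with onnxruntime before handing it to the DFC; a quick sketch (the 1x3x416x416 input shape is an assumption based on the default yolov4-tiny.cfg):

import numpy as np
import onnxruntime as ort

# Quick check that the exported graph runs and to see the output names/shapes.
# The 416x416 input matches the default yolov4-tiny.cfg; adjust if needed.
sess = ort.InferenceSession("model.onnx")
dummy = np.random.rand(1, 3, 416, 416).astype(np.float32)
outputs = sess.run(None, {sess.get_inputs()[0].name: dummy})
for meta, out in zip(sess.get_outputs(), outputs):
    print(meta.name, out.shape)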

I have just used damoyolo and converted it to onnx. Does that mean it is impossible for now to convert it to .hef for hailo8 usage?

Hi shashi, here is my onnx: yolov4-tiny-3l-352_256.onnx - Google Drive

@Dhiren_Wijesinghe
What was your model trained to detect? Can you also provide some test/validation images for calibration if the original dataset is not coco?

Hi @Gtech
We were able to compile and test the model you provided successfully. You can download the model (and test it in the browser) from our AI Hub at: DeGirum AI Hub; you will need to register for an account first. You can see our usage examples at: DeGirum/hailo_examples: DeGirum PySDK with Hailo AI Accelerators
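
The usage pattern from our examples repo looks roughly like the sketch below (the model name and zoo URL are placeholders; the exact names and token come from your AI Hub account, and the API may differ slightly between PySDK versions):

import degirum as dg

# Rough sketch of running a compiled model through DeGirum PySDK on a local
# Hailo device. Model name and zoo URL are placeholders from the AI Hub.
model = dg.load_model(
    model_name="yolov4_tiny_coco--416x416_quant_hailort_hailo8_1",  # placeholder
    inference_host_address="@local",
    zoo_url="degirum/hailo",
    token="",  # your AI Hub token
)
result = model("test_image.jpg")
print(result.results)  # list of detections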


Morning @shashi - do you have guidance on how to convert from the model I provided in onnx format into the required hef format for the Hailo 8L AI hat?

Hi @Gtech
You should use the last two convolutional layers as end nodes. You also need to have a postprocessor to interpret the outputs properly.
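
One quick way to find those node names (a sketch using the onnx Python package; node order is whatever the exporter produced, so it is worth double-checking the graph in a viewer such as Netron):

import onnx

# List the Conv nodes in the exported graph; for yolov4-tiny the last two
# are the detection heads to pass as end node names when parsing.
model = onnx.load("model.onnx")
conv_nodes = [n for n in model.graph.node if n.op_type == "Conv"]
for node in conv_nodes[-2:]:
    print(node.name, "->", node.output)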

I see that this topic comes up countless times, however there is no response from Hailo that really helps solve the problem.

Apologies Shashi, we’re wandering into territory that I have limited experience in. I’m not quite sure how to action your suggestion. Could you share some of the code from your experiments that you’ve used to get this running?

Hi @Gtech
This is definitely a complex topic with several potential challenges along the way. To help narrow things down, it would be good to know where exactly you are facing issues:

  • Have you successfully compiled other models before and are now running into trouble with this one?
  • Or is this your first time compiling a model to HEF?

If it’s your first time, I’d recommend checking out some helpful guides:

Model-specific challenges are also common, particularly with post-processing the outputs from the HEF file to interpret bounding boxes, confidence scores, and other details. In some cases, this logic is handled by the Hailo device itself; in others, custom code is needed. For popular models, we do have user guides that simplify this process.
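
To give a feel for where that custom code sits, a rough sketch of running a compiled HEF with the HailoRT Python API and handing the raw outputs to your own decoder would look something like this (the file name, input shape and interface are placeholders, and the exact API may vary between HailoRT versions):

import numpy as np
from hailo_platform import (HEF, VDevice, ConfigureParams, HailoStreamInterface,
                            InferVStreams, InputVStreamParams, OutputVStreamParams)

# Rough sketch only: run a compiled HEF and collect the raw output tensors,
# which then go into your own post-processing (e.g. a YOLO decoder + NMS).
hef = HEF("yolov4_tiny.hef")  # placeholder file name
with VDevice() as device:
    configure_params = ConfigureParams.create_from_hef(hef, interface=HailoStreamInterface.PCIe)
    network_group = device.configure(hef, configure_params)[0]
    network_group_params = network_group.create_params()
    input_params = InputVStreamParams.make(network_group)
    output_params = OutputVStreamParams.make(network_group)
    input_name = hef.get_input_vstream_infos()[0].name
    frame = np.zeros((1, 416, 416, 3), dtype=np.uint8)  # placeholder input
    with InferVStreams(network_group, input_params, output_params) as pipeline:
        with network_group.activate(network_group_params):
            raw_outputs = pipeline.infer({input_name: frame})
    # raw_outputs maps each output layer name to a numpy array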

For more specific or less common models, the best we can do is provide a pre-compiled model with the necessary assets (if we have experience with it). This typically results in working code, but if you want to reproduce the compilation yourself, deeper debugging may be required on your end.

Hope this helps point you in the right direction!

Hi @shashi, can I email you some training data?

Re damoyolo, it is possible to convert it to HEF and run it on Hailo-8. It is actually part of the Hailo Model Zoo. My comment was only about the post processing, which will not be offloaded to the Hailo accelerator or runtime library, so it needs to run on the host CPU as a part of the application code, either using ONNX Runtime, or by implementing it in C++ (or in any other way).

Hi @Dhiren_Wijesinghe
We were able to compile and run your model using our PySDK. We are still waiting for you to provide sample images, as we do not know what the model is supposed to detect. I sent you a direct message with my email so that you can share the data.

Instead of trying to use the Darknet framework, should I use the base models provided in the Hailo model zoo and retrain these on a custom dataset?
hailo_model_zoo
I can see there is an MIT license on the repo, and I assume any fine-tuned model can be used for commercial purposes?

Hi @Gtech
Even if you use base models from the Hailo model zoo, the compile process and postprocessing will be the same. I am still not sure where you are facing issues. In my previous response, we already provided the compiled model with the proper postprocessor attached. Were you unable to run the model we provided?

Hi @shashi, I’d also like to hop on here and ask if the Hailo team is able to compile this onnx file? I wrote up my efforts in this thread (Dataflow compiler parsing failure - incorrectly reshapes array) and this one (Providing an onnx file to the Hailo team which fails to compile on DFC?) previously.

Here is a link to the ONNX file: https://drive.google.com/file/d/1AfERIrhnTEQkOdG2VuuzKY1HGK9upKAd/view?usp=sharing