I am pretty new to Hailo; so far I've learned to run a .hef model on a Raspberry Pi 5 with a Hailo-10H. Now I want to convert my own model from ONNX to HEF. I have read tutorials and documentation (the DFC docs), and I still have trouble. On an x86 machine with the DFC I manage to parse, optimize, and compile a HEF from my custom model (specifying the end nodes), but I would like to know: is it possible to include the post-processing in the HEF itself?
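For reference, here is roughly what I'm running today (a sketch only: paths, the target architecture, and the node names are placeholders, and exact flags may differ between DFC versions):

```
# Parse: ONNX -> HAR, cutting the graph at the chosen end nodes
hailo parser onnx my_model.onnx --hw-arch hailo10h \
    --end-node-names /model/head/conv_a /model/head/conv_b

# Optimize: quantize the HAR using a calibration set
hailo optimize my_model.har --calib-set-path calib_images/

# Compile: optimized HAR -> HEF
hailo compiler my_model_optimized.har
```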
I've read about the nms_postprocess command in the .alls model script and the .json config it needs, but I don't understand how to make it work or how it works.
Can anyone enlighten me?
Thank you in advance.
Hi, yes - you can and should include the NMS post-processing in your HEF. It’s the recommended approach. When you parse your ONNX model, you cut the graph before the NMS/decode layers using end_node_names. Then you tell the DFC to re-attach an optimized NMS implementation via the nms_postprocess command in your .alls model script. This gets baked into the HEF so HailoRT handles it automatically at runtime.
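As a minimal sketch of what that looks like for a YOLOv8-style model (the layer name, thresholds, strides, and JSON path below are placeholders; take the real values from the model zoo's .alls and .json for your architecture):

```
# model_script.alls (sketch)
change_output_activation(conv42, sigmoid)
nms_postprocess("yolov8_nms_config.json", meta_arch=yolov8, engine=cpu)
```

The JSON describes how the raw head outputs are decoded into boxes, along the lines of:

```
{
  "nms_scores_th": 0.2,
  "nms_iou_th": 0.7,
  "image_dims": [640, 640],
  "max_proposals_per_class": 100,
  "classes": 80,
  "bbox_decoders": [
    { "name": "decoder_8",  "stride": 8,  "reg_layer": "conv41", "cls_layer": "conv42" }
  ]
}
```

The layer names in the JSON must match the layer names in your parsed HAR (not the original ONNX names), which is why the model zoo examples use names like convXX.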
I still have some questions about the example you gave me.
In yolov8s.alls, does change_output_activation(convXX, sigmoid) add a sigmoid after convXX?
In the .json and .alls they use convXX, but in my ONNX the nodes are named differently. Is there a way to see the end-node names in the HAR file?
This question is more general, but is it possible to convert a custom neural network?
Thank you.
Yes, please check out the tutorials in the Hailo AI Software Suite Docker. Inside the Docker, running the `hailo tutorial` command starts a Jupyter Notebook server with notebooks for each step of the workflow.
Not sure if this is the right place for this, but does anyone have a script for converting Hugging Face "Gemma 4"? I've managed a script for "Stable Diffusion" models. Has anyone developed scripts and/or processes for these conversions, followed by optimization and compilation to .hef?