Custom Embedded Solution

Hello @omria, we discussed this in a post I made, but you never answered me back. It would be very useful to have a guide on the process of writing post-processing functionality for models that are not included in the examples.
Specifically, we need documentation on what the HAILO module returns after predictions are made, how to properly configure the JSON files, and how to write the code to generate a correct post-processing .so compiled file.
Without this information, only the models covered in the examples are ready to use. I couldn’t find any documentation on this topic, which makes it hard to properly schedule time for this task.
YOLO is fine for an MVP, but for production we need something freely licensed, like MobileNet or other models.

Hey @Andrew92,

For some models — like YOLO and others — we actually have hailort-PP (post-processing) built directly into the model itself, so it runs on the Hailo device.
For custom C++ post-processing examples, you can check here:
:backhand_index_pointing_right: https://github.com/hailo-ai/hailo-apps-infra/tree/main/cpp


How post-processing works:

  • You’ll get tensor outputs based on how the model was compiled.
  • If you cut the model at a certain layer, you’ll need to manually run the remaining layers.
  • If the model is fully compiled, usually you just need to apply NMS (or a similar final step) on the output.
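To make the "fully compiled, just apply NMS" case concrete, here is a minimal, self-contained C++ sketch of greedy per-class NMS. The `Detection` struct and its field names are illustrative stand-ins, not the actual HailoRT types; you would adapt this to whatever tensor layout your compiled model emits.

```cpp
#include <algorithm>
#include <vector>

// Illustrative detection record; real field names depend on your pipeline.
struct Detection {
    float x1, y1, x2, y2;  // box corners, normalized to [0, 1]
    float score;
    int class_id;
};

// Intersection-over-union of two boxes.
static float iou(const Detection &a, const Detection &b) {
    float ix1 = std::max(a.x1, b.x1), iy1 = std::max(a.y1, b.y1);
    float ix2 = std::min(a.x2, b.x2), iy2 = std::min(a.y2, b.y2);
    float iw = std::max(0.0f, ix2 - ix1), ih = std::max(0.0f, iy2 - iy1);
    float inter = iw * ih;
    float area_a = (a.x2 - a.x1) * (a.y2 - a.y1);
    float area_b = (b.x2 - b.x1) * (b.y2 - b.y1);
    float uni = area_a + area_b - inter;
    return uni > 0.0f ? inter / uni : 0.0f;
}

// Greedy NMS: keep the highest-scoring box, drop same-class boxes
// that overlap it beyond iou_thresh.
std::vector<Detection> nms(std::vector<Detection> dets, float iou_thresh) {
    std::sort(dets.begin(), dets.end(),
              [](const Detection &a, const Detection &b) { return a.score > b.score; });
    std::vector<Detection> kept;
    for (const auto &d : dets) {
        bool suppressed = false;
        for (const auto &k : kept) {
            if (k.class_id == d.class_id && iou(k, d) > iou_thresh) {
                suppressed = true;
                break;
            }
        }
        if (!suppressed) kept.push_back(d);
    }
    return kept;
}
```

The C++ examples in the repo above show how this kind of step plugs into the actual pipeline.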

About guides:
We’re actively working on new guides — including one that covers how to configure the JSON files for post-processing!


For custom models:
If you can share what the output of your model looks like (shapes, expected outputs, etc.), I can help you set up the right config.

In general, your model_config.json would look something like this:

"post_processing": {
  "shared_object": "libmobilenet_postproc.so",
  "config": {
    "num_classes": 1000,
    "top_k": 5
  }
}
  • shared_object points to the .so file that handles post-processing.
  • config includes any parameters your post-processing needs.
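As a sketch of how values like `num_classes` and `top_k` from the config above might drive the code, here is a minimal top-k routine in plain C++. `PostProcConfig` and the function name are hypothetical; the real entry point and types for a post-processing `.so` are defined by the C++ examples linked earlier.

```cpp
#include <algorithm>
#include <numeric>
#include <vector>

// Hypothetical mirror of the "config" section in model_config.json.
struct PostProcConfig {
    int num_classes;  // e.g. 1000
    int top_k;        // e.g. 5
};

// Return the indices of the top_k highest logits, best first.
std::vector<int> top_k_classes(const std::vector<float> &logits,
                               const PostProcConfig &cfg) {
    std::vector<int> idx(logits.size());
    std::iota(idx.begin(), idx.end(), 0);
    int k = std::min<int>(cfg.top_k, static_cast<int>(idx.size()));
    std::partial_sort(idx.begin(), idx.begin() + k, idx.end(),
                      [&](int a, int b) { return logits[a] > logits[b]; });
    idx.resize(k);
    return idx;
}
```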

Feel free to send your model’s output details or HAR or HEF when you’re ready — I’ll help you build the matching config or build post-process!


Thanks! I'm talking about the JSON where the output layers, anchors, etc. are specified. I can't find any guide that explains how to do that.

I see post-processing for YOLO, but how do I write a post-processing .cpp for, say, MobileNet or other models?

Oh, got it!

For those, we actually have default configs in the Model Zoo:

Also, here’s the full guide for the Dataflow Compiler (DFC):
:backhand_index_pointing_right: Dataflow Compiler Documentation (v3.31.0)

And for the Model Zoo and HailoMZ configs, you can check the docs here:
:backhand_index_pointing_right: Hailo Model Zoo Documentation


For Mobilenet with C++:
You’ll need to build the specific post-processing based on your model’s output.
Check the C++ examples I shared earlier — you’ll also find the compilation scripts in the same repo.
For example, here’s the one used for depth estimation:
:backhand_index_pointing_right: Depth Estimation C++ Post-Processing
This one is fully custom and can be used as a base for the operation you need, depending on your output nodes.
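One step a custom MobileNet post-process typically needs before any classification logic is dequantizing the raw output tensor back to floats. Here is a hedged sketch; the `QuantInfo` struct is illustrative, so check the C++ examples for how the actual scale and zero-point are read from the output stream info.

```cpp
#include <cstdint>
#include <vector>

// Illustrative quantization parameters (not the actual HailoRT names).
struct QuantInfo {
    float scale;
    float zero_point;
};

// Convert raw uint8 output back to float: f = scale * (q - zero_point).
std::vector<float> dequantize(const std::vector<uint8_t> &raw, const QuantInfo &q) {
    std::vector<float> out;
    out.reserve(raw.size());
    for (uint8_t v : raw)
        out.push_back(q.scale * (static_cast<float>(v) - q.zero_point));
    return out;
}
```

After this step, a classification model like MobileNet usually just needs softmax and/or top-k on the dequantized vector.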


Quick Tip:
If you want fast help understanding .alls, .yaml, or post-processing config files:

  • Check the DFC and HailoMZ documentation on the Hailo website.
  • Honestly, one trick — give those docs to ChatGPT (or any assistant) and ask specific questions about the YAML, ALLS, or CONFIG files. There’s a lot of detailed info in the main documentation that can speed things up.