Multi-model setup on RPi 5 Hailo AI Kit with the Python API

Hello there,

I was testing my Hailo AI Kit on a Raspberry Pi 5. Running a single model with hailo run 'model_name.hef' works fine, but as soon as I go beyond one model I get an error that no physical devices are available. This can be worked around with run2 on hailortcli, but not with the Python Hailo API.

I also could not find any documentation for the Python API on running multiple models on the Hailo chip. run2 is fine for testing, but for deployment I need a proper setup. Is there anything I am missing in the docs?

Need some input on this.


Hey @hyperwolf,

To run multiple models on the Hailo-8, we use the scheduling interface. For each HEF file, we create a separate HailoAsyncInference instance and run them through the scheduler.

Here’s an example in C++: Hailo Scheduler Example

And here’s the async object detection class in Python: Async Object Detection Class

By creating instances of this class for each HEF and running them with the HailoSchedulingAlgorithm of your choice, you can effectively run two or more models on the chip simultaneously.
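To make this concrete, here is a minimal sketch that uses the hailo_platform (HailoRT) Python API directly rather than the HailoAsyncInference helper from the linked example: one shared VDevice is created with the model scheduler enabled, each HEF is configured on it, and each model is fed from its own thread. The HEF names, dummy input data, and the ROUND_ROBIN choice are placeholders, and some keyword arguments differ between HailoRT versions, so treat it as a starting point rather than a drop-in solution.

```python
import threading

import numpy as np
from hailo_platform import (HEF, ConfigureParams, FormatType, HailoSchedulingAlgorithm,
                            HailoStreamInterface, InferVStreams, InputVStreamParams,
                            OutputVStreamParams, VDevice)

HEF_PATHS = ["model_a.hef", "model_b.hef"]  # placeholder HEF files


def run_model(hef_path, hef, network_group, num_frames=8):
    # Build vstream params and push a few dummy frames through this model.
    # (Older HailoRT versions may also expect quantized=False alongside FLOAT32.)
    in_params = InputVStreamParams.make(network_group, format_type=FormatType.FLOAT32)
    out_params = OutputVStreamParams.make(network_group, format_type=FormatType.FLOAT32)
    input_info = hef.get_input_vstream_infos()[0]
    dummy_input = np.random.rand(num_frames, *input_info.shape).astype(np.float32)

    # No manual network_group.activate() here: with the scheduler enabled it decides
    # which configured model runs on the chip at any given moment.
    with InferVStreams(network_group, in_params, out_params) as pipeline:
        results = pipeline.infer({input_info.name: dummy_input})
        print(hef_path, {name: out.shape for name, out in results.items()})


# One virtual device shared by every model, with the model scheduler turned on.
params = VDevice.create_params()
params.scheduling_algorithm = HailoSchedulingAlgorithm.ROUND_ROBIN

with VDevice(params) as vdevice:
    # Configure all HEFs on the same VDevice up front.
    configured = []
    for hef_path in HEF_PATHS:
        hef = HEF(hef_path)
        configure_params = ConfigureParams.create_from_hef(hef=hef, interface=HailoStreamInterface.PCIe)
        configured.append((hef_path, hef, vdevice.configure(hef, configure_params)[0]))

    # Run each model from its own thread; the scheduler time-shares the chip between them.
    threads = [threading.Thread(target=run_model, args=entry) for entry in configured]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

The HailoAsyncInference class from the linked example wraps this same pattern with queues and the async API, so once this sketch works you can swap in one instance of that class per HEF.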


Regards

Is there any chance I can run both a C++ and a Python model at the same time?