Multithreaded inference of two models on a single Hailo-8

Is there any way to perform multithreaded inference of two models on a single Hailo-8? The two models must run simultaneously. Is there an example of C++ code?

@Richard.Zhu Should both models run inference on the same source, or does each model have its own source (for example, two cameras)?

@Richard.Zhu
We developed a Python SDK that makes working with Hailo devices easier: DeGirum/hailo_examples
We also added an example on multithreading: multithreading notebook.
Please let us know if this is helpful.

In my case, each model has its own source.

That’s great, but how can I achieve this using C++? Is there a HailoRT API for this?

@Richard.Zhu
We currently only have a Python-based SDK, which is a wrapper over HailoRT.
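
For anyone looking for a C++ starting point: HailoRT itself exposes a C++ API, and its model scheduler can time-slice a single Hailo-8 between two configured models so their inference runs concurrently. Below is a minimal sketch of that approach, assuming HailoRT 4.x; the HEF paths, frame count, and the `quantized` flag in `create_vstreams` are placeholders or version-dependent details, not something confirmed in this thread.

```cpp
// Sketch only: not an official example and not verified against a specific
// HailoRT release. "model1.hef"/"model2.hef" and the frame count are
// placeholders; error handling is reduced to early returns.
#include "hailo/hailort.hpp"

#include <cstdint>
#include <iostream>
#include <memory>
#include <string>
#include <thread>
#include <vector>

using namespace hailort;

// Parse a HEF and configure it on the shared VDevice. Done once per model,
// from the main thread, before the inference threads start.
static std::shared_ptr<ConfiguredNetworkGroup> configure_model(
    VDevice &vdevice, const std::string &hef_path)
{
    auto hef = Hef::create(hef_path);
    if (!hef) { std::cerr << hef_path << ": Hef::create failed\n"; return nullptr; }

    auto configure_params = vdevice.create_configure_params(hef.value());
    if (!configure_params) { return nullptr; }

    auto network_groups = vdevice.configure(hef.value(), configure_params.value());
    if (!network_groups || network_groups.value().empty()) { return nullptr; }
    return network_groups.value().at(0);
}

// Per-model inference loop; each thread would normally pull frames from its
// own source (e.g. its own camera) instead of this zero-filled dummy buffer.
static void run_inference(std::shared_ptr<ConfiguredNetworkGroup> network_group,
                          size_t frame_count)
{
    // This overload (with the `quantized` flag) matches the HailoRT 4.x
    // examples; newer releases dropped the flag.
    auto vstreams = VStreamsBuilder::create_vstreams(
        *network_group, true /*quantized*/, HAILO_FORMAT_TYPE_AUTO);
    if (!vstreams) { return; }
    auto &inputs = vstreams.value().first;
    auto &outputs = vstreams.value().second;

    std::vector<uint8_t> in_frame(inputs[0].get_frame_size(), 0);
    std::vector<uint8_t> out_frame(outputs[0].get_frame_size());

    for (size_t i = 0; i < frame_count; i++) {
        inputs[0].write(MemoryView(in_frame.data(), in_frame.size()));
        outputs[0].read(MemoryView(out_frame.data(), out_frame.size()));
    }
}

int main()
{
    // Enable the model scheduler so both network groups stay configured and
    // HailoRT switches the single Hailo-8 between them automatically
    // (no manual activate/deactivate needed).
    hailo_vdevice_params_t params{};
    if (HAILO_SUCCESS != hailo_init_vdevice_params(&params)) { return 1; }
    params.scheduling_algorithm = HAILO_SCHEDULING_ALGORITHM_ROUND_ROBIN;

    auto vdevice = VDevice::create(params);
    if (!vdevice) { return 1; }

    auto model_a = configure_model(*vdevice.value(), "model1.hef");
    auto model_b = configure_model(*vdevice.value(), "model2.hef");
    if (!model_a || !model_b) { return 1; }

    // One thread per model; both run concurrently on the same device.
    std::thread t1(run_inference, model_a, 100);
    std::thread t2(run_inference, model_b, 100);
    t1.join();
    t2.join();
    return 0;
}
```

Build with something like `g++ -std=c++14 example.cpp -lhailort -lpthread`. The HailoRT GitHub repository also ships official C++ examples covering multiple network groups and the scheduler, which are worth checking for the exact API of your installed version.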

Okay, thanks for your reply.