HailoRT NVIDIA emulation on Ubuntu

Is there any way to run a complete user application on Ubuntu, using an NVIDIA GPU to emulate a Hailo-8?
The only info I found is how to run the Model Zoo (i.e. hailomz --target emulator), not HailoRT.

Hi @Thor,

As you mentioned, the hailomz eval tool is one option. Another option is to use the DFC:

from hailo_sdk_client import ClientRunner, InferenceContext

# model_path points to a HAR file produced by the DFC (not a HEF)
runner = ClientRunner(har=model_path)
# SDK_FP_OPTIMIZED emulates the optimized floating-point model
with runner.infer_context(InferenceContext.SDK_FP_OPTIMIZED) as ctx:
    # input_data_modified is the preprocessed input batch (a NumPy array)
    output = runner.infer(ctx, input_data_modified)

It’s important to note that the emulation should be used to check accuracy and not throughput/latency - it will be much slower than running on a Hailo device.

HailoRT is used for device inference; it is not responsible for the emulation.
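
For reference, a self-contained sketch of the same flow might look like the following. The HAR file name, input shape, and batch size are made-up placeholders for illustration; only the ClientRunner, infer_context, and infer calls shown above are actual API usage.

import numpy as np
from hailo_sdk_client import ClientRunner, InferenceContext

# Placeholder values for illustration only
har_path = "my_model.har"                                   # HAR produced by the DFC, not a HEF
batch = np.random.rand(8, 224, 224, 3).astype(np.float32)   # preprocessed NHWC batch

runner = ClientRunner(har=har_path)

# Emulated floating-point inference; use it for accuracy checks,
# not for measuring throughput or latency.
with runner.infer_context(InferenceContext.SDK_FP_OPTIMIZED) as ctx:
    output = runner.infer(ctx, batch)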

Hi @nina-vilela,
Thanks. A couple more questions.
1)
Ubuntu 22.04: I’ve managed to get to the end of the SW Suite self-extractable executable pre-installation (a number of issues with the install documentation, but OK).
1a) However, if I launch
$ ./hailo_ai_sw_suite_2024-10.run
there are no error messages, but TAPPAS is not listed as installed.
1b) If I launch
$ ./hailo_ai_sw_suite_2024-10.run -- install-tappas
I get the following error
ERROR: pip’s dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
hailo-dataflow-compiler 3.29.0 requires setuptools==68.0.0, but you have setuptools 66.0.0 which is incompatible.

So I guess
1a) doesn’t install hailo-dataflow-compiler 3.29.0 and doesn’t install TAPPAS
1b) has an issue with the setuptools version and doesn’t complete

Any help?

Hi @nina-vilela,
thanks.
I’m sure it’s a stupid question, but I still haven’t had the time to go through all the documentation.
Let’s say I want to do a very simple thing:
1a) take a simple example made for the RPi 5

1b) install it in the Hailo venv on my Ubuntu machine, and use the NVIDIA emulator
1c) I guess I have to replace
hailo_inference = HailoAsyncInference(
    hef_path=args.net,
    input_queue=input_queue,
    output_queue=output_queue,
)
with something (what)?

On which host?

Yes, you would need to replace anything related to HailoAsyncInference with the emulated inference, as in the code I sent you previously.
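
As a rough sketch (not an official API), a queue-driven replacement built on the emulated inference could look something like this. The class name, the None stop sentinel, and the assumption that the queue items are preprocessed NumPy batches are illustrative choices, so adapt them to whatever your pipeline actually passes; also note that the emulator takes the HAR produced by the DFC rather than the HEF that HailoAsyncInference expects.

from hailo_sdk_client import ClientRunner, InferenceContext

class EmulatedInference:
    # Hypothetical stand-in for HailoAsyncInference that runs the DFC emulator
    def __init__(self, har_path, input_queue, output_queue):
        self.runner = ClientRunner(har=har_path)   # HAR from the DFC, not a HEF
        self.input_queue = input_queue
        self.output_queue = output_queue

    def run(self):
        # Keep the emulation context open for the whole loop; it is costly to create
        with self.runner.infer_context(InferenceContext.SDK_FP_OPTIMIZED) as ctx:
            while True:
                batch = self.input_queue.get()
                if batch is None:                  # assumed stop sentinel
                    break
                # batch is assumed to be a preprocessed NumPy array (NHWC)
                output = self.runner.infer(ctx, batch)
                self.output_queue.put(output)

The instantiation then becomes something like
hailo_inference = EmulatedInference(har_path, input_queue, output_queue)
with the HEF path replaced by the path to your HAR.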

Or, say I’d like to run your example using emulation.

I guess I have to replace
hailo_inference = HailoAsyncInference(
    net_path, input_queue, output_queue, batch_size
)
with what?

For the detection_with_tracker example in particular, it will be a bit tricky because the supervision package will conflict with the DFC’s dependency requirements.