How to perform inference on Hailo-8L using the pre-compiled hef from Model Explorer

My goal:

Perform inference on Hailo-8L using the HEF binaries provided by Hailo. I downloaded HEFs from Model Explorer (Hailo-8L models) and from hailo-model-zoo (Hailo-8L public models), but I have no idea how to run them (I assume different models have different dependencies, inputs, and usage).

What I did:

  1. Installed HailoRT, the PCIe driver, and TAPPAS (I can already run the hailo-rpi5-examples demos, so the configuration should be correct, I guess).
  2. Followed the application examples in Hailo-Application-Code-Examples, but failed because the instructions are only for Hailo-8.

Note:
object_detection supports specifying the HEF file in the inference command, so I managed to perform inference on Hailo-8L by passing a Hailo-8L HEF in the inference command without modifying the Python scripts.

Problems:

  1. Hailo provides quite a lot of Hailo-8L-specific compiled HEF models on Model Explorer and in hailo-model-zoo (Hailo-8L public models), but I can't find instructions for running them.
  2. Hailo-Application-Code-Examples provides some Python scripts and README instructions for running the compiled HEF models, but unfortunately they are all for Hailo-8. As a result, when I follow the instructions, I always get an error saying that my model is for Hailo-8L, not Hailo-8.

I need some instructions on how to run the compiled HEF models on Hailo-8L. I really appreciate any help you can provide.

Welcome to the Hailo Community!

As long as the HEF file has the same inputs and outputs, you should be able to run it in the same application.
Use the following command to view the HEF file.

hailortcli parse-hef model.hef

Example output

Architecture HEF was compiled for: HAILO8
Network group name: resnet_v1_18, Single Context
    Network name: resnet_v1_18/resnet_v1_18
        VStream infos:
            Input  resnet_v1_18/input_layer1 UINT8, NHWC(224x224x3)
            Output resnet_v1_18/fc1 UINT8, NC(1000)

Architecture HEF was compiled for: HAILO8L
Network group name: resnet_v1_18, Single Context
    Network name: resnet_v1_18/resnet_v1_18
        VStream infos:
            Input  resnet_v1_18/input_layer1 UINT8, NHWC(224x224x3)
            Output resnet_v1_18/fc1 UINT8, NC(1000)
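If you want to check this in a script, here is a small illustrative helper (the function names are mine, not part of HailoRT) that pulls the target architecture out of the `hailortcli parse-hef` output; the subprocess call assumes `hailortcli` is on your PATH:

```python
import re
import subprocess


def parse_architecture(parse_hef_output: str) -> str:
    """Extract the target architecture (e.g. 'HAILO8' or 'HAILO8L')
    from the text printed by `hailortcli parse-hef`."""
    match = re.search(r"Architecture HEF was compiled for:\s*(\S+)", parse_hef_output)
    if match is None:
        raise ValueError("no architecture line found in parse-hef output")
    return match.group(1)


def hef_architecture(hef_path: str) -> str:
    """Run `hailortcli parse-hef` on a HEF file and return its architecture."""
    result = subprocess.run(
        ["hailortcli", "parse-hef", hef_path],
        capture_output=True, text=True, check=True,
    )
    return parse_architecture(result.stdout)
```

For example, feeding the Hailo-8L output shown above to `parse_architecture` returns `"HAILO8L"`.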

In this case, the only difference between the two HEFs is the target architecture. For some models you will also see a difference in the number of contexts.
You can run both HEFs on a Hailo-8, e.g.

hailortcli run model.hef

For the Hailo-8L variant you will get the following warning, but the model will still run successfully.

[HailoRT] [warning] HEF was compiled for Hailo8L device, while the device itself is Hailo8. This will result in lower performance.

If you have a Hailo-8L module, you will only be able to run HEF files compiled for Hailo-8L. You will get an error from HailoRT when you try to run a HEF compiled for Hailo-8.

[HailoRT] [error] HEF format is not compatible with device. Device arch: Hailo8L, HEF arch: Hailo8
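The compatibility rules above can be summed up in a tiny sketch (the function and its return values are mine, purely for illustration — HailoRT itself decides this at configure time):

```python
def hef_compatibility(device_arch: str, hef_arch: str) -> str:
    """Return 'ok', 'warn', or 'error' for a device/HEF architecture pair,
    following the HailoRT behaviour described above:
      - matching architectures run normally,
      - a Hailo-8L HEF runs on a Hailo-8 device with a lower-performance warning,
      - a Hailo-8 HEF fails on a Hailo-8L device."""
    if device_arch == hef_arch:
        return "ok"
    if device_arch == "HAILO8" and hef_arch == "HAILO8L":
        return "warn"   # runs, with the "lower performance" warning shown above
    return "error"      # e.g. "HEF format is not compatible with device"
```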

So, if you have an issue with a specific example, please provide some more detailed information.