Working Python inference example with .hef on Raspberry Pi (HailoRT 4.21.0, critical request)

Hi everyone,
I’m working with a Hailo-8 accelerator on Raspberry Pi 5 (64-bit, Python 3.11, HailoRT 4.21.0).
I’ve installed the official package:
hailort-4.21.0-cp311-cp311-linux_aarch64.whl from the Hailo website.

:brain: My goal:

  • Load my custom .hef model (YOLOv8n with 2 classes: OK / Not OK),
  • Perform inference on a single image (PNG or JPEG) using Python,
  • Retrieve the result (classification or detection) and print/display it.

:red_exclamation_mark: The problem:

  • High-level Python APIs such as Inferer or VStreamsBuilder are missing from hailo_platform,
  • Even with Device and HEF, all my attempts ended in low-level complexity or errors,
  • I can’t find any working Python inference example for ARM64 (Raspberry Pi) in the documentation or the community.

:police_car_light: Please help:

I really, really need a working example of Python inference on an image using a .hef — any minimal working script would help a lot.
Even a classification-only pipeline would be a great start.

:red_question_mark: Questions:

  1. Is there any official or community working Python example using .hef with hailort on Raspberry Pi?
  2. Is this currently supported in Python at all, or only in C++?
  3. If high-level API is not available in this wheel, are there alternatives?

Thank you so much in advance — I’ve already spent days on this, and any help or real code would be invaluable.

Welcome to the Hailo Community!

Did you have a look at our Application Code Examples repository?

GitHub - Hailo Application Code Examples - Python
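For reference, the single-image flow in those examples uses the low-level hailo_platform API roughly as sketched below. This is a sketch based on the HailoRT 4.x Python bindings, not a verified script; the helper names `preprocess` and `infer_hef` are made up for illustration, so check the class names against the docs of your installed wheel if an import fails.

```python
import numpy as np

def preprocess(img, h, w):
    """Nearest-neighbour resize an HxWx3 image to (h, w, 3) uint8
    and add a batch axis, giving the NHWC layout the vstream expects."""
    ys = np.arange(h) * img.shape[0] // h
    xs = np.arange(w) * img.shape[1] // w
    return np.expand_dims(img[ys][:, xs].astype(np.uint8), axis=0)

def infer_hef(hef_path, frame):
    """Run one frame through a .hef on a Hailo-8 (sketch; API names
    assumed from the HailoRT 4.x hailo_platform package)."""
    from hailo_platform import (HEF, VDevice, HailoStreamInterface,
                                ConfigureParams, InferVStreams,
                                InputVStreamParams, OutputVStreamParams,
                                FormatType)
    hef = HEF(hef_path)
    with VDevice() as device:
        cfg = ConfigureParams.create_from_hef(hef, interface=HailoStreamInterface.PCIe)
        network_group = device.configure(hef, cfg)[0]
        ng_params = network_group.create_params()
        in_info = hef.get_input_vstream_infos()[0]
        h, w, _ = in_info.shape
        in_params = InputVStreamParams.make(network_group, format_type=FormatType.UINT8)
        out_params = OutputVStreamParams.make(network_group, format_type=FormatType.FLOAT32)
        with InferVStreams(network_group, in_params, out_params) as pipeline:
            with network_group.activate(ng_params):
                # Returns a dict keyed by output vstream name -> numpy array
                return pipeline.infer({in_info.name: preprocess(frame, h, w)})
```

Usage would be something like `results = infer_hef("best.hef", frame)` with `frame` an RGB numpy array (e.g. from OpenCV with the channel order flipped). For a YOLOv8 .hef the outputs are raw tensors, so NMS/decoding still has to be applied afterwards.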

Hi @Artur_Korniiuk
At DeGirum we developed PySDK, a Python package to simplify application development with Hailo devices. You can see installation instructions and examples at: DeGirum/hailo_examples: DeGirum PySDK with Hailo AI Accelerators. For your specific case, we have a user guide: User Guide 2: Running Your First Object Detection Model on a Hailo Device Using DeGirum PySDK. Please feel free to reach out if you need help getting things to work.

Hi, I have a problem: 404 Client Error: Not Found for url: https://hub.degirum.com/zoo/v1/public/models/degirum/public//home/canacon/project/sorter/project/model_zoo/best.hef/info
but I set inference_host_address to local and local=True. Can you help me please?

Hi @Artur_Korniiuk
Can you please share your code snippet? It is mostly some typo in specifying the model zoo path.
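To illustrate the kind of typo involved (a sketch, not the poster's actual code): with PySDK, `model_name` should be the zoo entry name and `zoo_url` the local zoo directory, and passing the full .hef path as the model name is what produces a 404 against hub.degirum.com. The helper names below are made up for illustration, and the `degirum.load_model` call is my reading of the PySDK docs:

```python
import os

def model_name_from_path(hef_path):
    """'/home/pi/model_zoo/best.hef' -> 'best' (strip directory and extension)."""
    return os.path.splitext(os.path.basename(hef_path))[0]

def load_local_model(zoo_dir, model_name):
    """Sketch of a local-zoo load with DeGirum PySDK."""
    import degirum as dg  # lazy import: only needed on the device
    return dg.load_model(
        model_name=model_name,            # zoo entry name, not a file path
        inference_host_address="@local",  # run on the local Hailo accelerator
        zoo_url=zoo_dir,                  # directory containing the model entry
    )
```

So with a zoo folder like /home/canacon/project/sorter/project/model_zoo, the call would be `load_local_model("/home/canacon/project/sorter/project/model_zoo", "best")` rather than passing the .hef path itself.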

Hi @Artur_Korniiuk
Just checking if the issue still persists.

Hi, the local host is working now, thanks! But I have a new problem: I tried to run inference with the pretrained YOLOv8n COCO model from your User Guide 2 on the cat image, but I get bad results.

It looks like detections for all 80 possible classes are returned.
My JSON:

{
    "ConfigVersion": 10,
    "Checksum": "d5f1b2e5c1b7e1e8d9a4c3b2a1f0e9d8c7b6a5f4e3d2c1b0a9f8e7d6c5b4a3",
    "DEVICE": [
        {
            "DeviceType": "HAILO8",
            "RuntimeAgent": "HAILORT",
            "SupportedDeviceTypes": "HAILORT/HAILO8"
        }
    ],
    "PRE_PROCESS": [
        {
            "InputType": "Tensor",
            "InputN": 1,
            "InputH": 640,
            "InputW": 640,
            "InputC": 3,                
            "InputRawDataType": "DG_UINT8"
        }
    ],
    "MODEL_PARAMETERS": [
        {
            "ModelPath": "best1.hef"
        }
    ],
    "POST_PROCESS": [
        {
            "OutputPostprocessType": "None"
        }
    ]
}
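One thing worth noting about the JSON above: with "OutputPostprocessType": "None", PySDK returns the raw output tensors without YOLO decoding or NMS, which can look like spurious detections across all 80 classes. A hedged alternative for the POST_PROCESS section, assuming the key names used in DeGirum's public model zoo JSONs for YOLOv8 (verify against an actual zoo entry, e.g. their yolov8n model), might look like:

```json
"POST_PROCESS": [
    {
        "OutputPostprocessType": "DetectionYoloV8",
        "OutputNumClasses": 80,
        "OutputConfThreshold": 0.3,
        "LabelsPath": "labels_coco.json"
    }
]
```

Here "LabelsPath" would point at a class-label file placed next to the .hef in the same zoo entry.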