User Guide 2: Running Your First Object Detection Model on a Hailo Device Using DeGirum PySDK

Hi @Cameron_Ward
Please change HAILO8l to HAILO8L (lowercase l to uppercase L), and likewise HAILORT/HAILO8l to HAILORT/HAILO8L.
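For reference, here is a sketch of how the corrected entries would look in the model JSON (only the DEVICE section is shown; the exact set of surrounding fields is an assumption based on the config posted later in this thread):

```json
"DEVICE": [
    {
        "DeviceType": "HAILO8L",
        "RuntimeAgent": "HAILORT",
        "SupportedDeviceTypes": "HAILORT/HAILO8L"
    }
]
```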


That fixed it, thanks!


@Cameron_Ward
Thanks for confirming. We will see if we can clearly call this out in our docs/guides so that future users do not face this issue. Since edits are not allowed, we need to think of a better way.


Hi @shashi, I'm actually stuck running inference on multiple models with the DeGirum Suite. The multi-inference code in the DeGirum repo works with their own compiled HEF models, but when I run the same script with my custom HEF models (custom-trained YOLOv11n models), I get the following error:

Exception: Error detected during execution of AiSimpleGizmo:
  <class 'degirum.exceptions.DegirumException'>: Failed to perform model 'Person_Vehicle_CCTV' inference: [ERROR]Execution failed
Condition 'input_tensor->shape()[ 1 ] == 4 + m_OutputNumClasses' is not met: input_tensor->shape()[ 1 ] is 1, 4 + m_OutputNumClasses is 11
dg_postprocess_detection.cpp: 1612 [DG::DetectionPostprocessYoloV8::inputDataProcessBaseline]
When running model 'Person'

Error detected during execution of AiSimpleGizmo:
  <class 'degirum.exceptions.DegirumException'>: Failed to perform model 'fire_smoke_drone' inference: [ERROR]Execution failed
Condition 'input_tensor->shape()[ 1 ] == 4 + m_OutputNumClasses' is not met: input_tensor->shape()[ 1 ] is 1, 4 + m_OutputNumClasses is 11
dg_postprocess_detection.cpp: 1612 [DG::DetectionPostprocessYoloV8::inputDataProcessBaseline]
When running model 'fire_smoke'

The configs of both my models look like the following:

{
    "Checksum": "dummy_sh",
    "ConfigVersion": 10,
    "DEVICE": [
        {
            "DeviceType": "HAILO8",
            "RuntimeAgent": "HAILORT",
            "SupportedDeviceTypes": "HAILORT/HAILO8",
            "ThreadPackSize": 6
        }
    ],
    "MODEL_PARAMETERS": [
        {
            "ModelPath": "Person.hef"
        }
    ],
    "POST_PROCESS": [
        {
            "LabelsPath": "labels_Person.json",
            "OutputNumClasses": 7,
            "OutputPostprocessType": "DetectionYoloV8"
        }
    ],
    "PRE_PROCESS": [
        {
            "InputN": 1,
            "InputH": 640,
            "InputW": 640,
            "InputC": 3,
            "InputQuantEn": true
        }
    ]
}
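The failing condition in the error above can be illustrated in isolation. This is a minimal sketch (not DeGirum's actual code) of the shape check that dg_postprocess_detection.cpp performs; the example tensor shapes are assumptions for illustration only:

```python
def yolov8_shape_ok(output_shape: tuple, num_classes: int) -> bool:
    """Mirror of the postprocessor's assertion: the second dimension of the
    raw detection tensor must equal 4 box coordinates + num_classes."""
    return output_shape[1] == 4 + num_classes

# With OutputNumClasses = 7 (as in the config above), DetectionYoloV8
# expects shape[1] == 4 + 7 == 11.
print(yolov8_shape_ok((1, 11, 8400), 7))   # True: raw YOLO-style head (shape assumed)

# The error reports shape[1] == 1, so the check fails for the custom HEFs:
print(yolov8_shape_ok((1, 1, 100, 6), 7))  # False: shape[1] is 1, not 11
```

This only illustrates the arithmetic behind the error message; the actual output tensor layout of a compiled HEF depends on how the model was exported and compiled.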

Is there a way to compile my models to match the DeGirum environment, such as DeGirum's model compiler, or should I do something else? The models I downloaded from their AI Hub work perfectly for multi-inference.

Hi @Nitheesh_Bopparaju
The postprocessor is configured incorrectly. Please see our guide: User Guide 3: Simplifying Object Detection on a Hailo Device Using DeGirum PySDK