Converting a YOLOX-s model to HEF format

What changes do I need to make to convert the pretrained YOLOX-s model from the official YOLOX repository to HEF format? I reused hailo_model_zoo/cfg/networks/yolox_s_wide_leaky.yaml, changed the end node names, and then compiled the model to HEF with the following command

hailomz compile --hw-arch hailo8l --yaml /hailo_model_zoo/cfg/networks/yolox_s.yaml --ckpt yolox_s.onnx --calib-path hailo_model_zoo/coco_dataset --performance

However, running hailortcli run yolox_s.hef gave the following error:

Running streaming inference (…/yolox_s.hef):
Transform data: true
Type: auto
Quantized: true
[HailoRT] [error] CHECK failed - Failed to extract_detections, reg yolox_s_leaky/conv55_111 buffer_size should be 25600, but is 6400
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
Network yolox_s_leaky/yolox_s_leaky: 100% | 0 | FPS: 0.00 | ETA: 00:00:00
[HailoRT CLI] [error] Failed waiting for threads with status HAILO_INVALID_ARGUMENT(2)
[HailoRT CLI] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT CLI] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT CLI] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2)
[HailoRT CLI] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_ARGUMENT(2) - Error failed running inference

Hey @theprimetux

The error you encountered, where a buffer size does not match what extract_detections expects, usually points to a mismatch between the output (end node) configuration of the compiled model and the structure the YOLOX detection post-processing assumes. Here are steps to troubleshoot and, hopefully, resolve the issue:

  1. Verify Layer Names and Structure:

    • Ensure that the end node names specified in your modified yolox_s_wide_leaky.yaml exist in yolox_s.onnx and refer to the intended layers. Changing the end node names can cut the graph at a different point than the post-processing expects, which directly changes the output buffer sizes.
    • Check that outputs such as yolox_s_leaky/conv55_111 have the expected number of channels and spatial dimensions. This usually means examining the layer definitions in yolox_s.onnx directly; see the ONNX inspection sketch after this list.
  2. Model Configuration Adjustments:

    • Compare yolox_s.yaml with your modified yolox_s_wide_leaky.yaml and review any parameters related to output nodes or post-processing that differ between them. A missing or mismatched parameter can leave the compiled outputs out of sync with what the detection post-processing expects.
    • If yolox_s.yaml contains settings specific to Hailo targets, make sure they are carried over into your modified yolox_s_wide_leaky.yaml.
  3. Calibration File Consistency:

    • Confirm that the calibration dataset path (--calib-path) is correct, accessible in your environment, and contains images compatible with the model's input. A COCO-based set is appropriate for YOLOX; note, however, that calibration affects quantization accuracy rather than output shapes or buffer sizes, so it is unlikely to be the cause of this particular error.
  4. Inspect the Compilation Output for Warnings:

    • When running the hailomz compile command, watch for warnings. They can flag layer-configuration mismatches or point to output shapes that do not match what the detection post-processing expects.
  5. Testing Different Output Node Names:

    • Since you changed the end node names, try reverting to the original names and recompiling the model. Comparing the two builds, or inspecting the outputs of the compiled HEF directly (see the check at the end of this reply), will show whether the node change is what breaks compatibility with hailortcli's detection post-processing.

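To make step 1 concrete, here is a minimal sketch (assuming the onnx Python package is installed and that the exported file is named yolox_s.onnx, as in your command) that lists the graph outputs and the trailing convolution nodes, so the end node names in the yaml can be checked against the actual graph:

```python
# Minimal sketch: list the graph outputs and the last Conv nodes of the
# exported model so the end node names in the yaml can be verified.
# Assumes the `onnx` package is installed and the file is yolox_s.onnx.
import onnx

model = onnx.load("yolox_s.onnx")

# Graph outputs as exported (name + shape, when the exporter recorded it)
for out in model.graph.output:
    dims = [d.dim_value or d.dim_param for d in out.type.tensor_type.shape.dim]
    print("graph output:", out.name, dims)

# The end nodes usually point at the last convolution of each detection
# branch; print the trailing Conv nodes to find their exact names.
convs = [n for n in model.graph.node if n.op_type == "Conv"]
for node in convs[-9:]:
    print("conv:", node.name, "->", node.output[0])
```

Comparing these names and shapes against the end nodes in your yaml should quickly reveal whether a branch was cut at the wrong layer.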
If these steps do not resolve the issue, please consider sharing additional details, such as the exact end node names you used and any other modifications to the yaml.
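As a final quick check, note that the expected size in the error (25600) is exactly 4x the reported size (6400): at a 640x640 input, the stride-8 reg (box-regression) branch of YOLOX produces an 80x80 map with 4 channels (80 x 80 x 4 = 25600), while 6400 corresponds to a single-channel 80x80 map, so it looks like the end node chosen for that branch does not have the expected 4 output channels. You can confirm what the compiled HEF actually exposes with the HailoRT Python bindings; a minimal sketch (assuming the hailo_platform package that ships with HailoRT is installed):

```python
# Minimal sketch: print the output layers of the compiled HEF and their
# shapes, to compare against what the YOLOX post-processing expects
# (e.g. 80x80x4 for the stride-8 reg branch at a 640x640 input).
# Assumes the hailo_platform Python package installed with HailoRT.
from hailo_platform import HEF

hef = HEF("yolox_s.hef")
for info in hef.get_output_vstream_infos():
    print(info.name, info.shape)
```

If the reg outputs show a last dimension of 1 instead of 4, that would suggest the end nodes point at the wrong layer, and they should be moved back to the 4-channel regression convolutions identified in the ONNX graph.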