hailo_sdk_client.tools.core_postprocess.nms_postprocess.NMSConfigPostprocessException: The layer yolov8n/conv41 doesn't have one output layer

After running hailomz parse, I received the following error, although it was working just a week ago.

It's a simple script, the same one I've been using all along just to convert a .pt model to ONNX and then to .hef:

#!/bin/bash

# Check if the correct number of arguments was passed
if [ "$#" -ne 3 ]; then
    echo "Usage: $0 <path_to_pt_file> <path_to_calib_folder> <output_onnx_filename>"
    exit 1
fi

# Set arguments
PT_FILE=$1
CALIB_PATH=$2
ONNX_FILE=$3

# Define working directory and paths
WORKING_DIR="."
HAR_FILE="${WORKING_DIR}/yolov8s.har"
HAILO_MODEL_ZOO="${WORKING_DIR}/hailo_model_zoo"
MODEL_SCRIPT="${HAILO_MODEL_ZOO}/hailo_model_zoo/cfg/alls/generic/yolov8s.alls"

# Step 1: Convert YOLO model to ONNX
echo "Exporting YOLO model to ONNX..."
yolo export model="${PT_FILE}" format=onnx

# Step 2: Parse the ONNX model using Hailo
echo "Parsing ONNX model for Hailo hardware..."
hailomz parse --hw-arch hailo8l --ckpt "${WORKING_DIR}/${ONNX_FILE}" yolov8s

# Step 3: Optimize the parsed model
echo "Optimizing the model..."
hailomz optimize --hw-arch hailo8l --calib-path "${WORKING_DIR}/${CALIB_PATH}" --har "${HAR_FILE}" --classes 1 yolov8s

# Step 4: Compile the model for Hailo
echo "Compiling the model..."
hailomz compile --hw-arch hailo8l --model-script "${MODEL_SCRIPT}" --calib-path "${WORKING_DIR}/${CALIB_PATH}" --har "${HAR_FILE}" --classes 1 yolov8s

echo "Process complete. Output saved as yolov8s.hef"

roman@MSI:/mnt/c/Users/Roman/Desktop/Hailo_working$ hailomz compile --ckpt yolov8s.onnx --calib-path /mnt/c/Users/Roman/Desktop/Hailo_working/calib_plane --yaml /mnt/c/Users/Roman/Desktop/Hailo_working/hailo_model_zoo/hailo_model_zoo/cfg/networks/yolov8s.yaml --start-node-names 'images' --end-node-names '/model.22/cv2.0/cv2.0.2/Conv' '/model.22/cv3.0/cv3.0.2/Conv' '/model.22/cv2.1/cv2.1.2/Conv' '/model.22/cv3.1/cv3.1.2/Conv' '/model.22/cv2.2/cv2.2.2/Conv' '/model.22/cv3.2/cv3.2.2/Conv' --classes 1
Start run for network yolov8s …
Initializing the hailo8 runner…
[info] Translation started on ONNX model yolov8s
[info] Restored ONNX model yolov8s (completion time: 00:00:00.68)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.78)
[info] NMS structure of yolov8 (or equivalent architecture) was detected.
[info] In order to use HailoRT post-processing capabilities, these end node names should be used: /model.22/cv2.0/cv2.0.2/Conv /model.22/cv3.0/cv3.0.2/Conv /model.22/cv2.1/cv2.1.2/Conv /model.22/cv3.1/cv3.1.2/Conv /model.22/cv2.2/cv2.2.2/Conv /model.22/cv3.2/cv3.2.2/Conv.
[info] Start nodes mapped from original model: 'images': 'yolov8s/input_layer1'.
[info] End nodes mapped from original model: '/model.22/cv2.0/cv2.0.2/Conv', '/model.22/cv3.0/cv3.0.2/Conv', '/model.22/cv2.1/cv2.1.2/Conv', '/model.22/cv3.1/cv3.1.2/Conv', '/model.22/cv2.2/cv2.2.2/Conv', '/model.22/cv3.2/cv3.2.2/Conv'.
[info] Translation completed on ONNX model yolov8s (completion time: 00:00:01.20)

[info] Saved HAR to: /mnt/c/Users/Roman/Desktop/Hailo_working/yolov8s.har
Preparing calibration data…
[info] Loading model script commands to yolov8s from /mnt/c/Users/Roman/Desktop/Hailo_working/hailo_model_zoo/hailo_model_zoo/cfg/alls/generic/yolov8s.alls
[info] Loading model script commands to yolov8s from string
Traceback (most recent call last):
  File "/home/roman/.local/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/mnt/c/Users/Roman/Desktop/Hailo_working/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/mnt/c/Users/Roman/Desktop/Hailo_working/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
  File "/mnt/c/Users/Roman/Desktop/Hailo_working/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 250, in compile
    _ensure_optimized(runner, logger, args, network_info)
  File "/mnt/c/Users/Roman/Desktop/Hailo_working/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 91, in _ensure_optimized
    optimize_model(
  File "/mnt/c/Users/Roman/Desktop/Hailo_working/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 319, in optimize_model
    optimize_full_precision_model(runner, calib_feed_callback, logger, model_script, resize, input_conversion, classes)
  File "/mnt/c/Users/Roman/Desktop/Hailo_working/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 305, in optimize_full_precision_model
    runner.optimize_full_precision(calib_data=calib_feed_callback)
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 1996, in optimize_full_precision
    self._optimize_full_precision(calib_data=calib_data, data_type=data_type)
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 1999, in _optimize_full_precision
    self._sdk_backend.optimize_full_precision(calib_data=calib_data, data_type=data_type)
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1483, in optimize_full_precision
    model, params = self._apply_model_modification_commands(model, params, update_model_and_params)
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1374, in _apply_model_modification_commands
    model, params = command.apply(model, params, hw_consts=self.hw_arch.consts)
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/script_parser/nms_postprocess_command.py", line 381, in apply
    pp_creator = create_nms_postprocess(
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_client/tools/core_postprocess/nms_postprocess.py", line 1661, in create_nms_postprocess
    pp_creator.prepare_hn_and_weights(hw_consts, engine)
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_client/tools/core_postprocess/nms_postprocess.py", line 1103, in prepare_hn_and_weights
    super().prepare_hn_and_weights(hw_consts, engine, layers_to_sigmoid)
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_client/tools/core_postprocess/nms_postprocess.py", line 1071, in prepare_hn_and_weights
    self.add_postprocess_layer_to_hn()
  File "/home/roman/.local/lib/python3.10/site-packages/hailo_sdk_client/tools/core_postprocess/nms_postprocess.py", line 1022, in add_postprocess_layer_to_hn
    raise NMSConfigPostprocessException(f"The layer {encoded_layer.name} doesn't have one output layer")
hailo_sdk_client.tools.core_postprocess.nms_postprocess.NMSConfigPostprocessException: The layer yolov8s/conv41 doesn't have one output layer

I am running Dataflow Compiler 3.28 and Model Zoo 2.12.

None of the other guides address this problem, so I don't really know where to go from here.

Note: I haven't edited the model in any way.

What might be the issue, and what should I do?

Hey @roman.karpenko ,

Welcome to the Hailo Community!

It seems like you're encountering an NMSConfigPostprocessException on the yolov8s/conv41 layer, where the error indicates that the layer doesn't feed exactly one output layer. This can happen due to changes in the ONNX model structure, incompatibility between tool versions, or an issue in the model parsing step.

Here are some steps to help you resolve the issue:


1. Verify the ONNX Model Structure Using Netron

  • Use Netron (https://netron.app) to visualize the structure of your ONNX model.
  • Look for the yolov8s/conv41 layer and confirm how many outputs it has.
    • If there are multiple outputs, it may indicate a change in the model structure.
    • Make sure the end nodes specified in your command align with the ONNX model structure.
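To make the check concrete: the error fires when an end node's output tensor is consumed by more than one downstream node. The sketch below is not the Hailo SDK's actual code, just a minimal stand-in that models an ONNX graph as a dict mapping each node name to the list of tensors it reads, and counts consumers per tensor the way you would when inspecting the graph in Netron (the node names here are hypothetical):

```python
from collections import defaultdict

def count_consumers(nodes):
    """Map each tensor name to the number of nodes that read it."""
    consumers = defaultdict(int)
    for _name, inputs in nodes.items():
        for tensor in inputs:
            consumers[tensor] += 1
    return consumers

# Hypothetical graph: conv41's output is read by both a Sigmoid and a
# Concat -- two consumers, which is the situation the NMS post-process
# check rejects ("doesn't have one output layer").
nodes = {
    "sigmoid1": ["conv41_out"],
    "concat1": ["conv41_out", "conv42_out"],
    "mul1": ["conv42_out"],
}

print(count_consumers(nodes)["conv41_out"])  # 2 -> would trip the check
```

If the tensor produced by conv41 shows more than one consumer in your actual graph, the chosen end node sits too early, and you should move the end node (or re-export) so each end node feeds a single consumer.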

2. Check for Model Zoo and SDK Compatibility

You mentioned you're using Model Zoo 2.12 and Dataflow Compiler 3.28. Ensure the versions are compatible by checking the release notes for both tools.

  • Try updating to the latest versions of both the Hailo Model Zoo and the SDK to rule out any version incompatibility.

3. Modify Your Parsing Command

Try re-running your parsing command with simplified node names. Ensure that the end node names are accurate based on your Netron inspection.

hailomz parse --hw-arch hailo8l --ckpt yolov8s.onnx yolov8s

Alternatively, ensure the --start-node-names and --end-node-names are correctly set, with accurate paths to the layers identified in Netron.
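The six end nodes follow a regular pattern in the Ultralytics YOLOv8 ONNX export (model.22 is the detection head; the cv2.* and cv3.* branches come in pairs, one per detection scale). A small helper like the one below, which simply generates the names matching the list printed in your parse log, can help avoid typos when pasting them into --end-node-names; it assumes the stock export where the head sits at index 22:

```python
def yolov8_end_nodes(head_index=22, num_scales=3):
    """Generate the end-node names in the order the Hailo parser lists them."""
    names = []
    for scale in range(num_scales):
        for branch in ("cv2", "cv3"):
            names.append(
                f"/model.{head_index}/{branch}.{scale}/{branch}.{scale}.2/Conv"
            )
    return names

for name in yolov8_end_nodes():
    print(name)
```

If your Netron inspection shows the head at a different module index (e.g. a modified architecture), the names will differ accordingly, which is exactly the kind of mismatch that produces parsing and NMS errors.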


4. Use a Fresh Conversion Process

Since the model worked previously, it’s possible that the conversion or the ONNX model export has changed. Try re-exporting the model with:

yolo export model=path_to_yolov8.pt format=onnx

After re-exporting, re-run the parsing and optimization steps using the same commands you used earlier.


5. Bypass NMS Post-Processing for Debugging

As a workaround, you can try disabling Hailo NMS post-processing to isolate the issue. If your model requires post-processing, you can apply NMS manually after inference.
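If you do run without Hailo's built-in NMS, the suppression has to happen on the host after inference. Here is a minimal pure-Python sketch of greedy NMS (single class, boxes as [x1, y1, x2, y2]); a real pipeline would use a vectorized implementation, and the threshold value is just an assumption:

```python
def iou(a, b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, iou_thresh=0.45):
    """Greedy NMS: keep the highest-scoring box, drop overlapping ones."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < iou_thresh for j in keep):
            keep.append(i)
    return keep

boxes = [[0, 0, 10, 10], [1, 1, 10, 10], [20, 20, 30, 30]]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # the two overlapping boxes collapse to one
```

This only isolates whether the failure is in the NMS config step; it doesn't fix the underlying graph issue.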

Here’s how you can modify your optimization or compile command to skip Hailo’s built-in NMS:

hailomz compile --hw-arch hailo8l --model-script your_model_script --no-nms

6. Use Debugging Logs for More Insight

Enable verbose logging to get more details about the error:

export HAILORT_LOG_LEVEL=debug

Then, re-run the process to capture more detailed logs and identify the specific issue with conv41.


Let me know if these steps help, or if the issue persists, I can assist further!

Best Regards,
Omri

Thank you very much, omria!

I suspect that when retraining a custom model, YOLO makes some changes to the structure. But if we take the default model (e.g., YOLOv8n pretrained on COCO128) and train it from scratch on our data, it compiles successfully.

So the solution here for YOLO is simply to use the default pretrained model and retrain it from scratch.