Not able to convert the har file to hef

Hey there, I'm not sure why, but I've been trying to convert this HAR file to a HEF, and every time I do, there's an error. The Model Zoo is installed, along with TAPPAS and the Suite; everything's there, but it just doesn't work. I've attached the error below. Thanks!!

(hailo_venv) ubuntu@ubuntu-SM2025LT0011:~/Hailo8l/sources/model_zoo$ export HAILO_MODEL_ZOO_DATA_PATH="/home/ubuntu/.hailomz/data/models_files/coco/2023-08-03"
hailomz compile yolov8n --ckpt /home/ubuntu/Hailo8l/model/runs/detect/train8/weights/best.onnx
Start run for network yolov8n …
Initializing the hailo8 runner…
[info] Translation started on ONNX model yolov8n
[info] Restored ONNX model yolov8n (completion time: 00:00:00.05)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.19)
[info] Simplified ONNX model for a parsing retry attempt (completion time: 00:00:00.57)
[info] According to recommendations, retrying parsing with end node names: ['/model.22/Concat_3'].
[info] Translation started on ONNX model yolov8n
[info] Restored ONNX model yolov8n (completion time: 00:00:00.02)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.14)
[info] NMS structure of yolov8 (or equivalent architecture) was detected.
[info] In order to use HailoRT post-processing capabilities, these end node names should be used: /model.22/cv2.0/cv2.0.2/Conv /model.22/cv3.0/cv3.0.2/Conv /model.22/cv2.1/cv2.1.2/Conv /model.22/cv3.1/cv3.1.2/Conv /model.22/cv2.2/cv2.2.2/Conv /model.22/cv3.2/cv3.2.2/Conv.
[info] Start nodes mapped from original model: 'images': 'yolov8n/input_layer1'.
[info] End nodes mapped from original model: '/model.22/Concat_3'.
[info] Translation completed on ONNX model yolov8n (completion time: 00:00:00.61)
[info] Translation started on ONNX model yolov8n
[info] Restored ONNX model yolov8n (completion time: 00:00:00.02)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.18)
[info] NMS structure of yolov8 (or equivalent architecture) was detected.
[info] In order to use HailoRT post-processing capabilities, these end node names should be used: /model.22/cv2.0/cv2.0.2/Conv /model.22/cv3.0/cv3.0.2/Conv /model.22/cv2.1/cv2.1.2/Conv /model.22/cv3.1/cv3.1.2/Conv /model.22/cv2.2/cv2.2.2/Conv /model.22/cv3.2/cv3.2.2/Conv.
[info] Start nodes mapped from original model: 'images': 'yolov8n/input_layer1'.
[info] End nodes mapped from original model: '/model.22/cv2.0/cv2.0.2/Conv', '/model.22/cv3.0/cv3.0.2/Conv', '/model.22/cv2.1/cv2.1.2/Conv', '/model.22/cv3.1/cv3.1.2/Conv', '/model.22/cv2.2/cv2.2.2/Conv', '/model.22/cv3.2/cv3.2.2/Conv'.
[info] Translation completed on ONNX model yolov8n (completion time: 00:00:00.63)
[info] Appending model script commands to yolov8n from string
[info] Added nms postprocess command to model script.
[info] Saved HAR to: /home/ubuntu/Hailo8l/sources/model_zoo/yolov8n.har
Preparing calibration data…
[info] Loading model script commands to yolov8n from /home/ubuntu/Hailo8l/sources/model_zoo/hailo_model_zoo/cfg/alls/generic/yolov8n.alls
[info] Starting Model Optimization
[warning] Reducing optimization level to 0 (the accuracy won't be optimized and compression won't be used) because there's no available GPU
[warning] Running model optimization with zero level of optimization is not recommended for production use and might lead to suboptimal accuracy results
[info] Model received quantization params from the hn
Traceback (most recent call last):
File "/home/ubuntu/Hailo8l/hailo_venv/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
File "/home/ubuntu/Hailo8l/sources/model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
File "/home/ubuntu/Hailo8l/sources/model_zoo/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
File "/home/ubuntu/Hailo8l/sources/model_zoo/hailo_model_zoo/main_driver.py", line 250, in compile
    runner = ensure_optimized(runner, logger, args, network_info)
File "/home/ubuntu/Hailo8l/sources/model_zoo/hailo_model_zoo/main_driver.py", line 91, in ensure_optimized
    optimize_model(
File "/home/ubuntu/Hailo8l/sources/model_zoo/hailo_model_zoo/core/main_utils.py", line 353, in optimize_model
    runner.optimize(calib_feed_callback)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 2128, in optimize
    self.optimize(calib_data, data_type=data_type, work_dir=work_dir)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 1970, in optimize
    self.sdk_backend.full_quantization(calib_data, data_type=data_type, work_dir=work_dir)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1125, in full_quantization
    self.full_acceleras_run(self.calibration_data, data_type)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1319, in full_acceleras_run
    optimization_flow.run()
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/tools/orchestator.py", line 306, in wrapper
    return func(self, *args, **kwargs)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 335, in run
    step_func()
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
    result = method(*args, **kwargs)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/tools/subprocess_wrapper.py", line 123, in parent_wrapper
    self.build_model()
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
    result = method(*args, **kwargs)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 249, in build_model
    model.compute_output_shape(shapes)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1091, in compute_output_shape
    return self.compute_and_verify_output_shape(input_shape, verify_layer_inputs_shape=False)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1125, in compute_and_verify_output_shape
    layer_output_shape = layer.compute_output_shape(layer_input_shapes)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/keras/engine/base_layer.py", line 917, in compute_output_shape
    outputs = self(inputs, training=False)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/keras/utils/traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
File "/tmp/__autograph_generated_filezzbpsesg.py", line 41, in tf__call
    outputs = ag__.converted_call(ag__.ld(self).call_core, (ag__.ld(inputs), ag__.ld(training)), dict(**ag__.ld(kwargs)), fscope)
File "/tmp/__autograph_generated_filexiac5vqe.py", line 90, in tf__call_core
    ag__.if_stmt((ag__.ld(self).postprocess_type in [ag__.ld(PostprocessType).NMS, ag__.ld(PostprocessType).BBOX_DECODER]), if_body_3, else_body_3, get_state_3, set_state_3, ('do_return', 'retval_'), 2)
File "/tmp/__autograph_generated_filexiac5vqe.py", line 22, in if_body_3
    retval_ = ag__.converted_call(ag__.ld(self).bbox_decoding_and_nms_call, (ag__.ld(inputs),), dict(is_bbox_decoding_only=(ag__.ld(self).postprocess_type == ag__.ld(PostprocessType).BBOX_DECODER)), fscope)
File "/tmp/__autograph_generated_file83xsmus9.py", line 116, in tf__bbox_decoding_and_nms_call
    ag__.if_stmt((ag__.ld(self).meta_arch in [ag__.ld(NMSOnCpuMetaArchitectures).YOLOV5, ag__.ld(NMSOnCpuMetaArchitectures).YOLOX]), if_body_5, else_body_5, get_state_5, set_state_5, ('decoded_bboxes', 'detection_score', 'do_return', 'retval_', 'inputs'), 4)
File "/tmp/__autograph_generated_file83xsmus9.py", line 113, in else_body_5
    ag__.if_stmt((ag__.ld(self).meta_arch == ag__.ld(NMSOnCpuMetaArchitectures).YOLOV5_SEG), if_body_4, else_body_4, get_state_4, set_state_4, ('decoded_bboxes', 'detection_score', 'do_return', 'retval_'), 4)
File "/tmp/__autograph_generated_file83xsmus9.py", line 110, in else_body_4
    ag__.if_stmt((ag__.ld(self).meta_arch == ag__.ld(NMSOnCpuMetaArchitectures).YOLOV8), if_body_3, else_body_3, get_state_3, set_state_3, ('decoded_bboxes', 'detection_score'), 2)
File "/tmp/__autograph_generated_file83xsmus9.py", line 69, in if_body_3
    (decoded_bboxes, detection_score) = ag__.converted_call(ag__.ld(self).yolov8_decoding_call, (ag__.ld(inputs),), dict(offsets=[0.5, 0.5]), fscope)
File "/tmp/__autograph_generated_filexl4eht0j.py", line 87, in tf__yolov8_decoding_call
    decoded_bboxes = ag__.converted_call(ag__.ld(tf).expand_dims, (ag__.ld(decoded_bboxes),), dict(axis=2), fscope)
ValueError: Exception encountered when calling layer "yolov8_nms_postprocess" (type HailoPostprocess).

in user code:

File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/base_hailo_none_nn_core_layer.py", line 45, in call  *
    outputs = self.call_core(inputs, training, **kwargs)
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 122, in call_core  *
    inputs,
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 157, in bbox_decoding_and_nms_call  *
    decoded_bboxes, detection_score = self.yolov8_decoding_call(inputs, offsets=[0.5, 0.5])
File "/home/ubuntu/Hailo8l/hailo_venv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 375, in yolov8_decoding_call  *
    decoded_bboxes = tf.expand_dims(decoded_bboxes, axis=2)

ValueError: Tried to convert 'input' to a tensor and failed. Error: None values not supported.

Call arguments received by layer "yolov8_nms_postprocess" (type HailoPostprocess):
• inputs=['tf.Tensor(shape=(None, 80, 80, 64), dtype=float32)', 'tf.Tensor(shape=(None, 80, 80, 1), dtype=float32)', 'tf.Tensor(shape=(None, 40, 40, 64), dtype=float32)', 'tf.Tensor(shape=(None, 40, 40, 1), dtype=float32)', 'tf.Tensor(shape=(None, 20, 20, 64), dtype=float32)', 'tf.Tensor(shape=(None, 20, 20, 1), dtype=float32)']
• training=False
• kwargs=<class 'inspect._empty'>
(hailo_venv) ubuntu@ubuntu-SM2025LT0011:~/Hailo8l/sources/model_zoo$

Hey @Shannen_Milton,

Your issue is most likely caused by skipping the explicit optimization step. We recommend following the complete three-step process:

hailomz parse yolov8n --ckpt /path/to/best.onnx
hailomz optimize yolov8n --calib-path /path/to/calibration/data
hailomz compile yolov8n --har /path/to/yolov8n.har

The optimization step is crucial, as it runs calibration and applies quantization so the model is compatible with the hardware. Without it, the HAR file remains incomplete and the HEF compilation fails.
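Before re-running `hailomz optimize`, it can save a round-trip to sanity-check the folder you pass to `--calib-path`. A minimal, stdlib-only sketch under the assumption that the calibration data is a flat directory of plain JPEG/PNG images (the `calib_images` path is a placeholder for your own dataset):

```python
# Sanity-check a calibration folder before running `hailomz optimize`.
# Assumption: calibration data is a flat directory of JPEG/PNG images;
# "calib_images" is a placeholder path, point it at your own dataset.
from pathlib import Path

# Magic bytes identifying the two image formats we accept.
SIGNATURES = {b"\xff\xd8\xff": "jpeg", b"\x89PNG": "png"}

def image_kind(path):
    """Return 'jpeg' or 'png' if the file starts with a known signature, else None."""
    head = path.read_bytes()[:4]
    for sig, kind in SIGNATURES.items():
        if head.startswith(sig):
            return kind
    return None

def check_calib_dir(calib_dir):
    """Return (total_file_count, names_of_files_that_are_not_images)."""
    files = sorted(p for p in Path(calib_dir).iterdir() if p.is_file())
    bad = [p.name for p in files if image_kind(p) is None]
    return len(files), bad

if __name__ == "__main__":
    total, bad = check_calib_dir("calib_images")  # placeholder path
    print(f"{total} files found, {len(bad)} not recognized as JPEG/PNG: {bad}")
```

If this reports stray non-image files (labels, caches, hidden files), remove them from the folder before optimizing.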

Common issues that can also break the conversion include calibration dataset format mismatches (which surface as yolov8_nms_postprocess errors), incorrect output node mapping, and CUDA installation problems. If the error persists after adding the optimization step, try explicitly specifying the output nodes with --end-node-names, or verify that your calibration dataset has the correct format and dimensions for the model you're working with.
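As a concrete example of the --end-node-names suggestion: your log already prints the six recommended Conv end nodes, so you can feed them back to the parser explicitly. A small sketch that assembles the invocation from those names (the ONNX path is a placeholder, and you should confirm the exact flag syntax with `hailomz parse --help` for your Model Zoo version):

```python
# Assemble a `hailomz parse` command that cuts the model at the six Conv
# outputs the parser recommended in the log, instead of the final Concat.
# "/path/to/best.onnx" is a placeholder for your own checkpoint.
import shlex

# The six end nodes reported by the "In order to use HailoRT post-processing
# capabilities, these end node names should be used:" log line.
END_NODES = [
    "/model.22/cv2.0/cv2.0.2/Conv", "/model.22/cv3.0/cv3.0.2/Conv",
    "/model.22/cv2.1/cv2.1.2/Conv", "/model.22/cv3.1/cv3.1.2/Conv",
    "/model.22/cv2.2/cv2.2.2/Conv", "/model.22/cv3.2/cv3.2.2/Conv",
]

def parse_command(onnx_path):
    """Return the hailomz parse invocation as an argv list."""
    return ["hailomz", "parse", "yolov8n", "--ckpt", onnx_path,
            "--end-node-names", *END_NODES]

if __name__ == "__main__":
    # Print a copy-pasteable shell command.
    print(shlex.join(parse_command("/path/to/best.onnx")))
```

Cutting at these nodes lets the Hailo NMS post-process take over from there, which is exactly the path the optimizer's yolov8 decoding expects.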