CHECK failed - HEF file length does not match (status = 26)

Hello guys,
I just tried the 'Using YOLOv8 Retraining Docker' tutorial.
URL: hailo-rpi5-examples/doc/retraining-example.md at main · hailo-ai/hailo-rpi5-examples · GitHub

Everything works fine until I upload the HEF files and try the examples with my own retrained HEF model.

The error message is shown below.

I couldn't find the cause by searching the issues on the Hailo community.

Please give me some advice on how to solve this problem. OTL

Thank you for reading.

(venv_hailo_rpi5_examples) jskang@raspberrypi:~/hailo-rpi5-examples $ python basic_pipelines/detection.py --hef resources/yolov8s.hef --input /dev/video0
hailomuxer name=hmux v4l2src device=/dev/video0 name=src_0 ! video/x-raw, width=640, height=480, framerate=30/1 ! queue name=queue_scale max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! videoscale n-threads=2 ! queue name=queue_src_convert max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! videoconvert n-threads=3 name=src_convert qos=false ! video/x-raw, format=RGB, width=640, height=640, pixel-aspect-ratio=1/1 ! tee name=t ! queue name=bypass_queue max-size-buffers=20 max-size-bytes=0 max-size-time=0 ! hmux.sink_0 t. ! queue name=queue_hailonet max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! videoconvert n-threads=3 ! hailonet hef-path=resources/yolov8s.hef batch-size=2 nms-score-threshold=0.3 nms-iou-threshold=0.45 output-format-type=HAILO_FORMAT_TYPE_FLOAT32 force-writable=true ! queue name=queue_hailofilter max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! hailofilter so-path=/home/jskang/hailo-rpi5-examples/basic_pipelines/…/resources/libyolo_hailortpp_post.so qos=false ! queue name=queue_hmuc max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! hmux.sink_1 hmux. ! queue name=queue_hailo_python max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! queue name=queue_user_callback max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! identity name=identity_callback ! queue name=queue_hailooverlay max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! hailooverlay ! queue name=queue_videoconvert max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! videoconvert n-threads=3 qos=false ! queue name=queue_hailo_display max-size-buffers=3 max-size-bytes=0 max-size-time=0 ! fpsdisplaysink video-sink=xvimagesink name=hailo_display sync=false text-overlay=False signal-fps-measurements=true
[HailoRT] [error] CHECK failed - HEF file length does not match
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
[HailoRT] [error] Failed parsing HEF file
[HailoRT] [error] Failed creating HEF
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
CHECK_EXPECTED_AS_STATUS failed with status=26
Segmentation fault

Hi @happistday,
Can you please make sure that you’ve compiled the HEF using hw_arch=hailo8l?
You can validate it with this command:
hailortcli parse-hef <path-to-hef>
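Also, since the message literally complains that the HEF file length does not match, it's worth ruling out a truncated or corrupted copy when moving the HEF to the Pi. A minimal, generic sketch (the file path is just a placeholder) that you can run on both the build machine and the Pi and compare:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Run this on both machines; the digests (and the file sizes) must
# match, otherwise the HEF was corrupted or truncated in transit.
# print(sha256_of("resources/yolov8s.hef"))
```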

Welcome to the Hailo Community!

Just in case you tried renaming a compiled HAR file to HEF: that does not work, you need the actual HEF file.
If you did not get a separate HEF file, you can extract it from the compiled HAR file.

Get the file information for the HAR file.

hailo har info model_compiled_model.har

Extract the HEF file.

hailo har extract --hef-path model.hef model_compiled_model.har

Hello,

I am getting the same errors when trying to convert an ONNX model to HEF, both with the CLI and with Python, as illustrated in the tutorial (Dataflow Compiler v3.28.0).

The CLI commands I use:

hailo parser onnx --hw-arch hailo8l resnet_v1_18.onnx

hailo optimize --hw-arch hailo8l --calib-set-path resnet_v1_18_calib.npy resnet_v1_18.har 

hailo compiler --hw-arch hailo8l resnet_v1_18_optimized.har

(the onnx and npy files come from the tutorials package)

When I try

hailortcli parse-hef resnet_v1_18.hef

on the Raspberry Pi 5, I get these errors:

[HailoRT] [error] CHECK failed - HEF file length does not match
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
[HailoRT] [error] Failed parsing HEF file
[HailoRT] [error] Failed creating HEF
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
[HailoRT CLI] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26) - Failed to parse HEF

If instead I try to parse or run one of the installed hailo-models (e.g. yolov6n.hef), it runs fine.

OK, the runtime package on the Pi has not yet been updated to the latest, and a HEF created with 3.28 is newer than what it supports.
On some releases we update some internal fields in the HEF proto, which causes compatibility issues.
What can you do?
Either use 3.27 for the time being (until we release the Pi runtime SW, which should be at the beginning of August), or wait for the new release.
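In other words, a HEF built by a newer DFC than the runtime understands will fail to parse. A tiny illustrative check of that rule (the version numbers are just the ones mentioned in this thread; the authoritative mapping is the compatibility table in the Hailo docs):

```python
def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '3.28.0' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def hef_may_be_too_new(dfc_version: str, max_supported_dfc: str) -> bool:
    """True if the HEF was built with a newer DFC than the runtime supports."""
    return parse_version(dfc_version) > parse_version(max_supported_dfc)

# Example from this thread: the Pi runtime at the time only supported
# HEFs up to DFC 3.27, so a 3.28.0 HEF fails with HAILO_INVALID_HEF(26).
# hef_may_be_too_new("3.28.0", "3.27.0")  ->  True
```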


Thanks @Nadav,
indeed it works with 3.27.

Thank you for your comments,
but unfortunately neither option (with or without the hw-arch flag) works.

Thank you for your comments.

But in my case,
I didn't change anything in the HAR files;
the HEF file was simply produced by the command
"hailomz compile ~~".
I just took the output HEF file, moved it to the Hailo device, and tried it.

Same in my case.

In my case, I just tried to follow the documents like this:

Models from URL1 (error happens): hailo-rpi5-examples GitHub
(Why was uploading the link not permitted…?)

Pre-installed models from URL2: basic pipelines GitHub using the retrained models

In my case, I just compiled the models from *.pt to *.onnx and *.hef files (without translating them to *.har myself; the compile procedure automatically does the optimization and the HAR-to-HEF translation).

Thank you for sharing your work, and let's try to solve this issue this week!

Thank you for your comments!
I will try it today and share my results.

Congratulations!

Thank you for sharing your experience;
I hope my work goes well too.
I will share my results after I try it.

Thank you for your wonderful comments.

To match the DFC version with the other compatible libraries, I set up the Hailo AI Software Suite again with the 2024-04 version,

and thankfully the problems are solved!

Now it's time to enjoy the Hailo AI world.

Thank you so much and have a wonderful day :slight_smile:

For HailoRT there is no 3.x.x version; as far as I can see there are only 1.xx and 4.xx. Did I look at the wrong package?

Welcome to the Hailo Community!

My colleague was referring to the version of the Hailo Dataflow Compiler. The version numbers for various components progress independently. The compatible versions of our tools are listed in the Hailo AI Software Suite User Guide.

Hailo Developer Zone - Documentation - Versions Compatibility

Hi. Is there a way to be notified when a new release happens?

I tried several HEFs from GitHub
(hailo-ai/hailo_model_zoo/blob/master/docs/public_models/HAILO8L/HAILO8L_object_detection.rst);
all throw status=26 on the RPi5.

It seems to be because they all "were compiled using Hailo Dataflow Compiler v3.28.0".

I too face this issue while trying to use 'hand_landmark_lite.hef', as it was compiled with v3.28.0. So I am trying to access the previous version here:
/ModelZoo/Compiled/v2.11.0/hailo8l/hand_landmark_lite.hef

However, access is denied for the v2.11.0 version alone. The page says:

AccessDenied

Access Denied


To download models compiled with a previous MZ/DFC version, you can select previous tags in the repo. Here is the list of the models for MZ v2.11 (which means DFC v3.27.0)

You can find here the hand_landmark_lite model compiled with DFC v3.27.0, MZ 2.11.
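For reference, the download paths for precompiled models appear to follow the pattern quoted earlier in this thread. A small helper to build such a path (the pattern is inferred from that single example, not from official docs, so treat it as an assumption):

```python
def model_zoo_hef_path(mz_version: str, hw_arch: str, model_name: str) -> str:
    """Build the relative path of a precompiled Model Zoo HEF, following
    the /ModelZoo/Compiled/v<MZ>/<arch>/<model>.hef pattern seen above."""
    return f"/ModelZoo/Compiled/v{mz_version}/{hw_arch}/{model_name}.hef"

# model_zoo_hef_path("2.11.0", "hailo8l", "hand_landmark_lite")
# -> "/ModelZoo/Compiled/v2.11.0/hailo8l/hand_landmark_lite.hef"
```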

Thanks for the tip.
Unfortunately I can see nothing in object_detection and depth_estimation.
Everything is empty on my side, except hand_landmark_detection.

Hi,
I ran into the exact same problem, and although I managed to install DFC 3.27, I ran into this error instead:

Traceback (most recent call last):
  File "/local/workspace/hailo_virtualenv/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 250, in compile
    _ensure_optimized(runner, logger, args, network_info)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 91, in _ensure_optimized
    optimize_model(
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 319, in optimize_model
    optimize_full_precision_model(runner, calib_feed_callback, logger, model_script, resize, input_conversion, classes)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 305, in optimize_full_precision_model
    runner.optimize_full_precision(calib_data=calib_feed_callback)
  File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 1597, in optimize_full_precision
    self._optimize_full_precision(calib_data=calib_data, data_type=data_type)
  File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 1600, in _optimize_full_precision
    self._sdk_backend.optimize_full_precision(calib_data=calib_data, data_type=data_type)
  File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1284, in optimize_full_precision
    model, params = self._apply_model_modification_commands(model, params, update_model_and_params)
  File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1200, in _apply_model_modification_commands
    model, params = command.apply(model, params, hw_consts=self.hw_arch.consts)
  File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/script_parser/nms_postprocess_command.py", line 323, in apply
    self._update_config_file(hailo_nn)
  File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/script_parser/nms_postprocess_command.py", line 456, in _update_config_file
    self._update_config_layers(hailo_nn)
  File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/script_parser/nms_postprocess_command.py", line 498, in _update_config_layers
    self._set_yolo_config_layers(hailo_nn)
  File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/script_parser/nms_postprocess_command.py", line 533, in _set_yolo_config_layers
    raise AllocatorScriptParserException("Cannot infer bbox conv layers automatically. "
hailo_sdk_client.sdk_backend.sdk_backend_exceptions.AllocatorScriptParserException: Cannot infer bbox conv layers automatically. Please specify the bbox layer in the json configuration file

Did I mess up the installation, or is there anything else that causes this problem?