About DFC and hailo_zoo_model

I finished installing the Hailo Model Zoo.
I use the command:
hailomz compile --ckpt /root/model_onnx/yolov5s.onnx --calib-path /root/model_onnx/images --yaml /root/DFC_hailo/hailo_model_zoo/hailo_model_zoo/cfg/networks/yolov5s.yaml --start-node-names 'start-name1' 'name2' --end-node-names 'name1' --classes 80
Error info:
Start run for network yolov5s …
Initializing the hailo8 runner…
[info] Translation started on ONNX model yolov5s
[info] Restored ONNX model yolov5s (completion time: 00:00:00.09)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.50)
[info] Simplified ONNX model for a parsing retry attempt (completion time: 00:00:01.22)
Traceback (most recent call last):
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 220, in translate_onnx_model
parsing_results = self._parse_onnx_model_to_hn(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 300, in _parse_onnx_model_to_hn
return self.parse_model_to_hn(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 340, in parse_model_to_hn
converter = ONNXConverter(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py", line 170, in __init__
super().__init__(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 27, in __init__
super().__init__(graph, start_node_names, end_node_names)
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/model_translator/translator.py", line 51, in __init__
self._calculate_valid_subgraph_scope()
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/model_translator/translator.py", line 388, in _calculate_valid_subgraph_scope
current_vertex.in_valid_subgraph = True
AttributeError: 'NoneType' object has no attribute 'in_valid_subgraph'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/root/pyhailo/bin/hailomz", line 33, in <module>
sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
File "/root/DFC_hailo/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
run(args)
File "/root/DFC_hailo/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
return handlers[args.command](args)
File "/root/DFC_hailo/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 250, in compile
_ensure_optimized(runner, logger, args, network_info)
File "/root/DFC_hailo/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 73, in _ensure_optimized
_ensure_parsed(runner, logger, network_info, args)
File "/root/DFC_hailo/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 108, in _ensure_parsed
parse_model(runner, network_info, ckpt_path=args.ckpt_path, results_dir=args.results_dir, logger=logger)
File "/root/DFC_hailo/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 97, in parse_model
model_name = translate_model(runner, network_info, ckpt_path, tensor_shapes=start_node_shapes)
File "/root/DFC_hailo/hailo_model_zoo/hailo_model_zoo/utils/parse_utils.py", line 28, in translate_model
runner.translate_onnx_model(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
return func(self, *args, **kwargs)
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 1158, in translate_onnx_model
parser.translate_onnx_model(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 260, in translate_onnx_model
parsing_results = self._parse_onnx_model_to_hn(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 300, in _parse_onnx_model_to_hn
return self.parse_model_to_hn(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 340, in parse_model_to_hn
converter = ONNXConverter(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py", line 170, in __init__
super().__init__(
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 27, in __init__
super().__init__(graph, start_node_names, end_node_names)
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/model_translator/translator.py", line 51, in __init__
self._calculate_valid_subgraph_scope()
File "/root/pyhailo/lib/python3.8/site-packages/hailo_sdk_client/model_translator/translator.py", line 388, in _calculate_valid_subgraph_scope
current_vertex.in_valid_subgraph = True
AttributeError: 'NoneType' object has no attribute 'in_valid_subgraph'

I have traced through the relevant code based on the error, but I haven't solved it, because it runs from the command line and I can't easily attach a debugger.

Hi Kevin,
Are these the actual start and end node names that you're using? If so, that might be the problem.

I know they're optional, so I tried omitting those two parameters, but I run into the same kind of problem.

Can you share a snippet in netron of the end/start nodes?
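If Netron isn't handy, here's a minimal sketch (assuming the standard onnx Python package; the model path is just the one from your earlier command) that lists the graph inputs and outputs, which are the usual candidates for start/end node names:

```python
# Minimal sketch: list the input/output tensors of an exported ONNX model.
# Assumes the standard `onnx` pip package; the model path is an example.
import onnx

model = onnx.load("/root/model_onnx/yolov5s.onnx")
graph = model.graph

print("Graph inputs (typical start nodes):")
for inp in graph.input:
    dims = [d.dim_value for d in inp.type.tensor_type.shape.dim]
    print(f"  {inp.name}  shape={dims}")

print("Graph outputs (typical end nodes):")
for out in graph.output:
    print(f"  {out.name}")
```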

I used the official YOLOv5 PT-to-ONNX export to produce the model.

I think I am not clear about the ONNX-to-HEF conversion process.
I have installed the DFC and hailo_model_zoo in the same environment (WSL, Ubuntu 22.04).
I didn't get yolov5s.pt from the Hailo Model Zoo.
I got the .pt from the official YOLOv5 repo and used export.py to export it to ONNX.
I want to convert that ONNX to HEF; what should I do?
I need a complete process. Thanks!

@KNN2157 if you would like to use the pre-trained yolov5, we would recommend downloading it already compiled from our ModelZoo.

Otherwise, if you would like to compile a re-trained model, we recommend using our re-training docker.

Ok, I get it. I'll try the compiled model from the Hailo ModelZoo.

Is there any difference between using the re-training Docker and exporting directly with the official export script? Does it mean that if I use the official PT-to-ONNX export, the ONNX can only be compiled after parsing, optimization, profiling and other operations? I got the information from here: hailo_model_zoo-GETTING_STARTED

But I want to understand the ONNX-to-HEF conversion process itself. I can complete the training part by myself, so I want to know what to do after training in order to get a HEF. I currently use this command: hailomz compile --ckpt yolov5s.onnx --calib-path /path/to/calibration/imgs/dir/ --yaml path/to/yolov5s.yaml --classes 80, but it raises errors and I have been trying to figure out why. What is the difference between retraining through the Docker and training through the official repo myself (I'm sure the .pt I train myself works normally)?

We use a specific ultralytics tag for our re-training docker. Since we haven’t tested newer tags, we can’t guarantee everything will be exactly the same.

The error you originally saw seems related to incorrect information passed to hailomz compile. It could be the start and end nodes, for example. Unless the layer names in your model version are different from the layer names in our version, there’s no need to pass those arguments, as the ModelZoo already takes those names automatically from its configuration files.
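As a quick sanity check, you can compare those configured names against your ONNX. A small sketch, assuming the network YAML layout of recent ModelZoo releases, where the parser start/end nodes usually sit under parser → nodes (they may also be inherited from a file listed under base:):

```python
# Sketch: print the start/end node names hailomz would use for yolov5s.
# Assumes the ModelZoo network YAML layout (parser -> nodes); the nodes may
# instead live in one of the files listed under the 'base:' key.
import yaml

cfg_path = "/root/DFC_hailo/hailo_model_zoo/hailo_model_zoo/cfg/networks/yolov5s.yaml"
with open(cfg_path) as f:
    cfg = yaml.safe_load(f)

nodes = cfg.get("parser", {}).get("nodes")
if nodes is None:
    print("parser nodes not defined here; check the files under 'base:':", cfg.get("base"))
else:
    start_nodes, end_nodes = nodes
    print("start nodes:", start_nodes)
    print("end nodes:", end_nodes)
```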

Okay, thanks a lot. Now I can successfully convert using your model_zoo_onnx! This is great news. I have two more questions:

  1. If I train YOLOv5 with my own dataset, export it to ONNX, and then convert it to HEF format, do I have to go through the parsing → optimization → compilation process step by step to get the HEF, or can I directly use hailomz for compilation?
  2. For the --calib-path /path/to/calibration/imgs/dir/ parameter, do the images need to be the training images, or can they just be images with the same specifications? How many such images are usually required?

Additionally, when I run the newly generated HEF on Raspberry Pi 5 with Hailo8L, I encounter the following error:

[HailoRT] [error] CHECK failed - HEF file length does not match
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
[HailoRT] [error] Failed parsing HEF file
[HailoRT] [error] Failed creating HEF
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
Failed to create infer model, status = 26

I have seen other posts suggesting this is an incompatibility between Firmware Version 4.17.0 (release, app, extended context switch buffer) and DFC version 3.28. I installed using the command sudo apt install hailo-all. Is there a similar update command to upgrade the firmware to 4.18? The post suggests reinstalling HailoRT and the PCIe driver, but it also mentions that sudo apt install hailo-all does not install the PCIe driver. This is the issue: Model Zoo Model Download HEF Parser Error - General - Hailo Community


The hailomz compile command uses the DFC tools to perform the parsing, optimization and compilation, using the ModelZoo config files to set the arguments. It's your choice whether to use the ModelZoo or purely the DFC and handle all the configuration yourself.
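For context, the plain DFC flow (what hailomz compile drives under the hood) looks roughly like the sketch below, assuming the ClientRunner API of DFC 3.2x; the paths and the calibration file are placeholders:

```python
# Rough sketch of the plain DFC flow: parse -> optimize -> compile.
# Assumes the ClientRunner API of DFC 3.2x; paths are placeholders.
import numpy as np
from hailo_sdk_client import ClientRunner

runner = ClientRunner(hw_arch="hailo8")

# 1. Parsing: translate the ONNX into Hailo's internal representation
runner.translate_onnx_model("/root/model_onnx/yolov5s.onnx", "yolov5s")

# 2. Optimization: quantize with a calibration set of shape (N, H, W, C)
calib_data = np.load("calib_set.npy")
runner.optimize(calib_data)

# 3. Compilation: produce the HEF binary for the target device
hef = runner.compile()
with open("yolov5s.hef", "wb") as f:
    f.write(hef)
```

The ModelZoo route runs these same steps, but fills in the model-specific arguments (node names, normalization, NMS configuration) from its configuration files.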

The calibset images need to accurately represent the images that will be used for inference. For the calibration step (statistics collection), 64 images are enough. If you would like to use more advanced post-quantization algorithms, you may need at least 1024 samples.
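As an illustration, building such a calibration set can be as simple as the sketch below (assuming a 640x640 RGB input, the Pillow and numpy packages, and a folder of representative JPEGs; any model-specific pre-processing such as normalization is typically handled by the model's configuration):

```python
# Sketch: stack ~64 representative images into a calibration array (N, H, W, C).
# Assumes a 640x640x3 input and the Pillow/numpy packages; the folder path,
# image count and file extension are placeholders.
from pathlib import Path

import numpy as np
from PIL import Image

calib_dir = Path("/root/model_onnx/images")
paths = sorted(calib_dir.glob("*.jpg"))[:64]

calib_set = np.stack([
    np.asarray(Image.open(p).convert("RGB").resize((640, 640)), dtype=np.float32)
    for p in paths
])  # shape: (64, 640, 640, 3)

np.save("calib_set.npy", calib_set)
```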

The simplest solution would be to re-install the compatible DFC version (3.27.0) and re-compile the model.

Sorry, I don't really understand. If I use the ModelZoo, does that mean the model cannot be modified and can only be used directly? If I retrain a new YOLOv5 model, is it necessary to use the DFC for a series of processing steps?

BTW, I can't find version 3.27; the official website currently only has version 3.28.

Check for archived versions:

There are no issues with modifying models. Compiling re-trained/custom models is completely fine with both the ModelZoo and the DFC - as long as the operations are supported.

All the compilation steps are performed by the DFC, whether you use it directly or through the ModelZoo. The difference is that in the ModelZoo it can all be done with a single command, using configuration files to configure each step.

Wow, my fault, I got it!

Ok, I kind of get it! I'll continue to explore.