YOLO-NAS ONNX to HEF

Hi all,

I am trying to convert a YOLO-NAS medium model in ONNX format to HEF. I get a rather uninformative error and I'm not sure where to go from here.

Has anyone successfully converted a Yolo-NAS model before? Anything I can try to fix this error?

Thanks!
Here is my code:

from hailo_sdk_client import ClientRunner
model_name = "m5"
onnx_path = "./5_3_25_medium.onnx"
chosen_hw_arch = "hailo8r"

runner = ClientRunner(hw_arch=chosen_hw_arch)
hn, npz = runner.translate_onnx_model(onnx_path, model_name,
    end_node_names=["/mode/heads/Conv_2", "/mode/heads/Sigmoid", "/mode/heads/Conv_1", "/mode/heads/Conv"])

The output:

[info] Translation started on ONNX model m5
[info] Restored ONNX model m5 (completion time: 00:00:00.66)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:01.19)
[info] Unable to simplify the model: 'NoneType' object has no attribute 'in_valid_subgraph'
Traceback (most recent call last):
  File "/home/localadmin/Code/Hailo/parse.py", line 8, in <module>
    hn, npz = runner.translate_onnx_model(onnx_path, model_name, 
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 1158, in translate_onnx_model
    parser.translate_onnx_model(
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 242, in translate_onnx_model
    raise e from None
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 220, in translate_onnx_model
    parsing_results = self._parse_onnx_model_to_hn(
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 300, in _parse_onnx_model_to_hn
    return self.parse_model_to_hn(
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 340, in parse_model_to_hn
    converter = ONNXConverter(
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py", line 170, in __init__
    super().__init__(
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 27, in __init__
    super().__init__(graph, start_node_names, end_node_names)
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 51, in __init__
    self._calculate_valid_subgraph_scope()
  File "/home/localadmin/Code/Hailo/.venv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 388, in _calculate_valid_subgraph_scope
    current_vertex.in_valid_subgraph = True
AttributeError: 'NoneType' object has no attribute 'in_valid_subgraph'
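
A quick way to sanity-check the end node names before calling translate_onnx_model is to list the node names that actually exist in the ONNX graph. This is just a sketch using the onnx package, with the path and names copied from my script above:

import onnx

# Load the same ONNX that is passed to the ClientRunner
model = onnx.load("./5_3_25_medium.onnx")

requested = [
    "/mode/heads/Conv_2",
    "/mode/heads/Sigmoid",
    "/mode/heads/Conv_1",
    "/mode/heads/Conv",
]

# Every node name that actually appears in the graph
graph_names = {node.name for node in model.graph.node}

for name in requested:
    print(name, "found" if name in graph_names else "MISSING from the graph")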

Would love to hear if anyone else has made progress on getting a YOLO-NAS model optimized for and running on the Hailo-8. Thank you!


I ended up switching to DAMO-YOLO since it has a similar license and is officially supported by Hailo.

Can you try your ONNX on the Model-Zoo with this yaml file?

base: ["base/yolov8.yaml"]

postprocessing:
  anchors:
    regression_length: 16
  device_pre_post_layers:
      sigmoid: false
  score_threshold: 0.01

network:
  network_name: "yolo_nas_m"

paths:
  alls_script: yolo_nas_m.alls

parser:
  nodes: [null, ["/heads/head1/reg_pred/Conv", "/heads/head1/cls_pred/Conv",
                 "/heads/head2/reg_pred/Conv", "/heads/head2/cls_pred/Conv",
                 "/heads/head3/reg_pred/Conv", "/heads/head3/cls_pred/Conv"]]

With this alls:

normalization1 = normalization([0.0, 0.0, 0.0], [255.0, 255.0, 255.0])
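
As I understand it, that normalization line tells the compiler to add an on-chip layer computing (x - mean) / std, so the host side can feed raw 0-255 pixels. Roughly the host-side equivalent would be the following (a sketch; the 640x640 input size is only an example):

import numpy as np

mean = np.array([0.0, 0.0, 0.0], dtype=np.float32)
std = np.array([255.0, 255.0, 255.0], dtype=np.float32)

# Example uint8 frame; with the alls line above this scaling happens on the device instead
frame = np.random.randint(0, 256, size=(640, 640, 3), dtype=np.uint8)
normalized = (frame.astype(np.float32) - mean) / std  # values end up in [0, 1]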

Can you elaborate on what command to run for this? I am trying to do it but am not 100% sure which hailomz command to use with this config for a YOLO-NAS model.

Hi @nick2,
To convert an ONNX model to a HEF using the Model-Zoo, use the compile command as follows:

hailomz compile yolo_nas_m --hw-arch <TARGET_DEVICE> --yaml <YAML_PATH> --ckpt <ONNX_PATH> --model-script <MODEL_SCRIPT_PATH>

You can run hailomz compile -h for additional options and documentation.


Thanks. It looks like it is not valid to run it with both a model name and a yaml config.

When I run hailomz compile --yaml yolo_nas_config.yaml --ckpt nas.onnx I get

[info] Translation started on ONNX model yolo_nas_m
[info] Restored ONNX model yolo_nas_m (completion time: 00:00:00.11)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.39)
Traceback (most recent call last):
  File "/local/workspace/hailo_virtualenv/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 248, in compile
    _ensure_optimized(runner, logger, args, network_info)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 73, in _ensure_optimized
    _ensure_parsed(runner, logger, network_info, args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 108, in _ensure_parsed
    parse_model(runner, network_info, ckpt_path=args.ckpt_path, results_dir=args.results_dir, logger=logger)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 126, in parse_model
    raise Exception(f"Encountered error during parsing: {err}") from None
Exception: Encountered error during parsing: Unable to find end node names: ['/heads/head1/reg_pred/Conv', '/heads/head1/cls_pred/Conv', '/heads/head2/reg_pred/Conv', '/heads/head2/cls_pred/Conv', '/heads/head3/reg_pred/Conv', '/heads/head3/cls_pred/Conv'], please verify and try again.


You want the names of the cyan convolutions.

For me these are (from left to right):

/model/heads/head1/cls_pred/Conv
/model/heads/head2/cls_pred/Conv
/model/heads/head3/cls_pred/Conv
/model/heads/Conv
/model/heads/Conv_1
/model/heads/Conv_2

If that doesn’t work, I would try the first cyan block with the magenta block:

For me these are:

/model/heads/head1/cls_pred/Conv
/model/heads/head2/cls_pred/Conv
/model/heads/head3/cls_pred/Conv
/model/heads/head1/reg_pred/Conv
/model/heads/head2/reg_pred/Conv
/model/heads/head3/reg_pred/Conv

You can find the names in your onnx in Netron.

Thank you, here is the YOLO-NAS model in flat output format. Should I only be looking at the later area where there is a row of blue Conv nodes? How do you know the name (/model/heads/head1/cls_pred/Conv vs /model/heads/Conv)?

Yes, look near the bottom where you should see an identical graph. Click on a Conv to see its name; it will appear in a panel on the right-hand side of Netron.
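
If you would rather do it programmatically than click through Netron, something like this should print the candidate head convolutions (a sketch with the onnx package; I am assuming the names all contain "heads" as they do in my export):

import onnx

model = onnx.load("nas.onnx")  # the same file passed to --ckpt

# Print every Conv whose name suggests it belongs to the detection heads
for node in model.graph.node:
    if node.op_type == "Conv" and "heads" in node.name:
        print(node.name)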

Thank you, that makes sense. So I ran with this config

base: ["base/yolov8.yaml"]

postprocessing:
  anchors:
    regression_length: 16
  device_pre_post_layers:
      sigmoid: false
  score_threshold: 0.01

network:
  network_name: "yolo_nas_m"

paths:
  alls_script: yolo_nas_m.alls

parser:
  nodes: ["/model/heads/head1/cls_pred/Conv", "/model/heads/head2/cls_pred/Conv", "/model/heads/head3/cls_pred/Conv", "/model/heads/Conv", "/model/heads/Conv_1", "/model/heads/Conv_2"]

and I got

<Hailo Model Zoo INFO> Start run for network yolo_nas_m ...
<Hailo Model Zoo INFO> Initializing the hailo8 runner...
[info] Translation started on ONNX model yolo_nas_m
[info] Restored ONNX model yolo_nas_m (completion time: 00:00:00.09)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.60)
[info] Simplified ONNX model for a parsing retry attempt (completion time: 00:00:01.03)
Traceback (most recent call last):
  File "/local/workspace/hailo_virtualenv/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 248, in compile
    _ensure_optimized(runner, logger, args, network_info)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 73, in _ensure_optimized
    _ensure_parsed(runner, logger, network_info, args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 108, in _ensure_parsed
    parse_model(runner, network_info, ckpt_path=args.ckpt_path, results_dir=args.results_dir, logger=logger)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 126, in parse_model
    raise Exception(f"Encountered error during parsing: {err}") from None
Exception: Encountered error during parsing: The original node name /model/heads/head2/cls_pred/Conv in end_node_names is missing in the HN.
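
My guess is that the name disappears during the "parsing retry attempt" on the simplified model, where nodes can get renamed or fused. I do not know which simplifier the SDK uses internally, but running onnx-simplifier locally at least shows whether the name is fragile (a sketch, assuming the onnxsim package is installed):

import onnx
import onnxsim

model = onnx.load("nas.onnx")

# Run a simplification pass similar in spirit to the one mentioned in the log above
simplified, ok = onnxsim.simplify(model)
print("simplifier check passed:", ok)

wanted = "/model/heads/head2/cls_pred/Conv"
names = {node.name for node in simplified.graph.node}
print(wanted, "still present" if wanted in names else "renamed or fused away")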