Hello,
I am running into a problem compiling a Conformer model.
Commands I have tried:
hailo parser onnx --hw-arch hailo8 conformer_hailo_compatible.onnx --har-path conformer_model.har --input-format input=NCF
hailo parser onnx --hw-arch hailo8 conformer_encoder_only.onnx --har-path conformer_encoder.har --input-format input=NCF
hailo parser onnx --hw-arch hailo8 conformer_encoder_only.onnx --har-path conformer_encoder.har --input-format input=NCHW
hailo parser onnx --hw-arch hailo8 conformer_encoder_only.onnx --har-path conformer_encoder.har --tensor-shapes input=[1,80,100]
They all end with an error similar to this one:
hailo parser onnx --hw-arch hailo8 conformer_hailo_compatible.onnx --har-path conformer_encoder.har --input-format input=NCHW
[info] No GPU chosen and no suitable GPU found, falling back to CPU.
[info] Current Time: 21:09:58, 11/06/25
[info] CPU: Architecture: x86_64, Model: INTEL(R) XEON(R) GOLD 5520+, Number Of Cores: 112, Utilization: 0.0%
[info] Memory: Total: 503GB, Available: 493GB
[info] System info: OS: Linux, Kernel: 6.8.0-86-generic
[info] Hailo DFC Version: 3.33.0
[info] HailoRT Version: 4.23.0
[info] PCIe: No Hailo PCIe device was found
[info] Running hailo parser onnx --hw-arch hailo8 conformer_hailo_compatible.onnx --har-path conformer_encoder.har --input-format input=NCHW
[info] Translation started on ONNX model conformer_hailo_compatible
[warning] Large model detected. The graph may contain either a large number of operators, or weight variables with a very large capacity.
[warning] Translation time may be a bit long, and some features may be disabled (e.g. model augmentation, retry simplified model, onnx runtime hailo model extraction, etc.).
[info] Restored ONNX model conformer_hailo_compatible (completion time: 00:00:02.31)
Traceback (most recent call last):
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/tools/parser_cli.py", line 212, in run
self._parse(net_name, args, tensor_shapes)
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/tools/parser_cli.py", line 298, in _parse
self.runner.translate_onnx_model(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
return func(self, *args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 1192, in translate_onnx_model
parser.translate_onnx_model(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 248, in translate_onnx_model
raise e from None
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 235, in translate_onnx_model
parsing_results = self._parse_onnx_model_to_hn(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 316, in _parse_onnx_model_to_hn
return self.parse_model_to_hn(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 359, in parse_model_to_hn
fuser = HailoNNFuser(converter.convert_model(), net_name, converter.end_node_names)
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 82, in convert_model
self._create_layers()
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 37, in _create_layers
self._add_input_layers()
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 72, in _add_input_layers
input_shapes = vertex.get_input_layer_shapes()
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py", line 6313, in get_input_layer_shapes
raise UnsupportedInputFormatError(msg, recommendation=DEFAULT_FORMAT_BY_RANK.get(rank))
hailo_sdk_client.model_translator.exceptions.UnsupportedInputFormatError: Input format for input: [batch, channels, height, width] doesn't match its rank (3).
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/local/workspace/hailo_virtualenv/bin/hailo", line 8, in <module>
sys.exit(main())
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/tools/cmd_utils/main.py", line 111, in main
ret_val = client_command_runner.run()
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_platform/tools/hailocli/main.py", line 64, in run
return self._run(argv)
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_platform/tools/hailocli/main.py", line 104, in _run
return args.func(args)
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/tools/parser_cli.py", line 228, in run
raise ParserCLIException(msg) from err
hailo_sdk_client.tools.parser_cli.ParserCLIException: Input format for input: [batch, channels, height, width] doesn't match its rank (3). Please try parsing the model again, for example: hailo parser onnx --hw-arch hailo8 conformer_hailo_compatible.onnx --har-path conformer_encoder.har --input-format BWC
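If I understand the failure correctly, the parser rejects any --input-format whose rank doesn't match the input tensor's rank: my input shape [1,80,100] is rank 3, so NCHW (rank 4) can never match, while rank-3 formats like NCF or the suggested BWC should at least pass this check. Here is a rough sketch of that validation as I understand it (all names below are my own guesses, not the SDK's actual code):

```python
# Sketch of the rank check I assume the Hailo parser performs.
# Format strings name one axis per character, so rank == len(fmt).
DEFAULT_FORMAT_BY_RANK = {3: "BWC", 4: "NCHW"}  # per the error's recommendation

def check_input_format(fmt: str, tensor_rank: int) -> str:
    """Raise if the format's rank doesn't match the tensor's rank."""
    if len(fmt) != tensor_rank:
        suggestion = DEFAULT_FORMAT_BY_RANK.get(tensor_rank)
        raise ValueError(
            f"Input format {fmt!r} (rank {len(fmt)}) doesn't match "
            f"tensor rank ({tensor_rank}); try {suggestion!r}"
        )
    return fmt

# My input is [1, 80, 100] -> rank 3, so:
check_input_format("NCF", 3)      # passes the rank check
check_input_format("BWC", 3)      # passes the rank check
# check_input_format("NCHW", 3)   # would raise, matching the error I see
```

So my real question is less about the rank check itself and more about why the rank-3 attempts (NCF, and the explicit --tensor-shapes) still fail.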
I would appreciate feedback and guidance.