I am trying to compile the full YOLOv8-seg model. The following error occurred during the parsing stage:
2025-08-20 11:04:09.431737: I tensorflow/core/util/port.cc:110] oneDNN custom operations are on. You may see slightly different numerical results due to floating-point round-off errors from different computation orders. To turn them off, set the environment variable `TF_ENABLE_ONEDNN_OPTS=0`.
2025-08-20 11:04:09.469571: I tensorflow/core/platform/cpu_feature_guard.cc:182] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
To enable the following instructions: AVX2 AVX512F AVX512_VNNI FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
2025-08-20 11:04:10.115570: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
[info] Translation started on ONNX model yolov8s-seg
[info] Restored ONNX model yolov8s-seg (completion time: 00:00:00.22)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.86)
[info] Simplified ONNX model for a parsing retry attempt (completion time: 00:00:02.39)
Traceback (most recent call last):
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 239, in translate_onnx_model
parsing_results = self._parse_onnx_model_to_hn(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 320, in _parse_onnx_model_to_hn
return self.parse_model_to_hn(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 371, in parse_model_to_hn
fuser = HailoNNFuser(converter.convert_model(), net_name, converter.end_node_names)
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 83, in convert_model
self._create_layers()
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 39, in _create_layers
self._add_direct_layers()
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 162, in _add_direct_layers
raise ParsingWithRecommendationException(
hailo_sdk_client.model_translator.exceptions.ParsingWithRecommendationException: Parsing failed. The errors found in the graph are:
UnsupportedModelError in op /model.22/Sub: In vertex /model.22/Sub_input the constant value shape (1, 2, 8400) must be broadcastable to the output shape [2, 1, 8400]
UnsupportedModelError in op /model.22/Add_1: In vertex /model.22/Add_1_input the constant value shape (1, 2, 8400) must be broadcastable to the output shape [2, 1, 8400]
Please try to parse the model again, using these end node names: /model.22/Concat, /model.22/proto/cv3/act/Mul, /model.22/Sigmoid, /model.22/Slice_1, /model.22/Slice
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/local/shared_with_docker/YOLO-jevois-split/hailo_parsing_v8s_seg.py", line 16, in <module>
hn, npz = runner.translate_onnx_model(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
return func(self, *args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 1187, in translate_onnx_model
parser.translate_onnx_model(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 280, in translate_onnx_model
parsing_results = self._parse_onnx_model_to_hn(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 320, in _parse_onnx_model_to_hn
return self.parse_model_to_hn(
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 371, in parse_model_to_hn
fuser = HailoNNFuser(converter.convert_model(), net_name, converter.end_node_names)
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 83, in convert_model
self._create_layers()
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 39, in _create_layers
self._add_direct_layers()
File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 162, in _add_direct_layers
raise ParsingWithRecommendationException(
hailo_sdk_client.model_translator.exceptions.ParsingWithRecommendationException: Parsing failed. The errors found in the graph are:
UnsupportedModelError in op /model.22/Sub: In vertex /model.22/Sub_input the constant value shape (1, 2, 8400) must be broadcastable to the output shape [2, 1, 8400]
UnsupportedModelError in op /model.22/Add_1: In vertex /model.22/Add_1_input the constant value shape (1, 2, 8400) must be broadcastable to the output shape [2, 1, 8400]
In summary, the key error is:
hailo_sdk_client.model_translator.exceptions.ParsingWithRecommendationException: Parsing failed. The errors found in the graph are:
UnsupportedModelError in op /model.22/Sub: In vertex /model.22/Sub_input the constant value shape (1, 2, 8400) must be broadcastable to the output shape [2, 1, 8400]
UnsupportedModelError in op /model.22/Add_1: In vertex /model.22/Add_1_input the constant value shape (1, 2, 8400) must be broadcastable to the output shape [2, 1, 8400]
Here is an image of the relevant part of the graph. There are two Add/Sub pairs; the ones in question are the upper Add and Sub.
My understanding is that a Sub or Add between tensors of shape (1, 2, 8400) should produce an output of shape (1, 2, 8400).
However, Hailo seems to require the output to be (2, 1, 8400).
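A quick NumPy check (a minimal sketch, using the shapes from the error message, not the actual model tensors) shows why the parser objects: under standard NumPy/ONNX broadcasting, (1, 2, 8400) against (2, 1, 8400) expands both operands, so the constant cannot be broadcast to exactly [2, 1, 8400]:

```python
import numpy as np

# Shapes taken from the parser error: the constant is (1, 2, 8400),
# but Hailo expects it to broadcast to an output of [2, 1, 8400].
const_shape = (1, 2, 8400)
output_shape = (2, 1, 8400)

# Standard broadcasting expands BOTH operands along their size-1 dims,
# so the combined result is (2, 2, 8400) -- not [2, 1, 8400].
result = np.broadcast_shapes(const_shape, output_shape)
print(result)  # (2, 2, 8400)
```

This suggests the parser is not complaining about the arithmetic itself, but about a layout mismatch between the constant and the tensor it is combined with.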
Is my understanding correct? If so, would forcing the Sub and Add output shapes to (2, 1, 8400) by editing the ONNX graph make parsing succeed?
Thank you for reading.
