ERROR: Trying to convert a DEIMv2 model to .hef format

Dear Hailo team and community,

We have run into some issues with your compiler and parser when trying to parse a custom model:

Intellindust-AI-Lab/DEIMv2: [DEIMv2] Real Time Object Detection Meets DINOv3

We can't even parse it into the HAR format; the parser fails with the error message below.

Our code to translate the ONNX model to HEF:

    runner = ClientRunner(hw_arch=chosen_hw_arch)
    hn, npz = runner.translate_onnx_model(
        onnx_path,
        onnx_model_name,
        start_node_names=["images", "orig_target_sizes"],
        net_input_shapes={"images": [1,3, image_h, image_w], "orig_target_sizes": [1,2]},

    )

ERROR:

[info] Translation started on ONNX model DEIMv2_s
[warning] Large model detected. The graph may contain either a large number of operators, or weight variables with a very large capacity.
[warning] Translation time may be a bit long, and some features may be disabled (e.g. model augmentation, retry simplified model, onnx runtime hailo model extraction, etc.).
[info] Restored ONNX model DEIMv2_s (completion time: 00:00:00.50)

IndexError Traceback (most recent call last)
Cell In [5], line 2
1 runner = ClientRunner(hw_arch=chosen_hw_arch)
----> 2 hn, npz = runner.translate_onnx_model(
3 onnx_path,
4 onnx_model_name,
5 start_node_names=["images", "orig_target_sizes"],
6 net_input_shapes={"images": [1,3, image_h, image_w], "orig_target_sizes": [1,2]},
7
8 )

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_common/states/states.py:16, in allowed_states.<locals>.wrap.<locals>.wrapped_func(self, *args, **kwargs)
12 if self._state not in states:
13 raise InvalidStateException(
14 f"The execution of {func._name_} is not available under the state: {self._state.value}",
15 )
—> 16 return func(self, *args, **kwargs)

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py:1187, in ClientRunner.translate_onnx_model(self, model, net_name, start_node_names, end_node_names, net_input_shapes, augmented_path, disable_shape_inference, disable_rt_metadata_extraction, net_input_format, **kwargs)
1144 """
1145 DFC API for parsing an ONNX model. This creates a runner with loaded HN (model) and
1146 parameters.
(...)
1184
1185 """
1186 parser = Parser()
-> 1187 parser.translate_onnx_model(
1188 model=model,
1189 net_name=net_name,
1190 start_node_names=start_node_names,
1191 end_node_names=end_node_names,
1192 net_input_shapes=net_input_shapes,
1193 augmented_path=augmented_path,
1194 disable_shape_inference=disable_shape_inference,
1195 disable_rt_metadata_extraction=disable_rt_metadata_extraction,
1196 net_input_format=net_input_format,
1197 **kwargs,
1198 )
1199 return self._finalize_parsing(parser.return_data)

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:252, in Parser.translate_onnx_model(self, model, net_name, start_node_names, end_node_names, net_input_shapes, augmented_path, disable_shape_inference, disable_rt_metadata_extraction, net_input_format, **kwargs)
250 irrelevant_exception = isinstance(e, (MisspellNodeError, UnsupportedInputFormatError))
251 if large_model_detected or long_model_detected or irrelevant_exception:
--> 252 raise e from None
254 try:
255 simplified_model, is_valid = onnxsim.simplify(onnx_model, skip_fuse_bn=True)

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:239, in Parser.translate_onnx_model(self, model, net_name, start_node_names, end_node_names, net_input_shapes, augmented_path, disable_shape_inference, disable_rt_metadata_extraction, net_input_format, **kwargs)
236 onnx.save_model(onnx_model, augmented_path)
238 try:
--> 239 parsing_results = self._parse_onnx_model_to_hn(
240 onnx_model=onnx_model,
241 net_name=valid_net_name,
242 start_node_names=start_node_names,
243 end_node_names=end_node_names,
244 net_input_shapes=net_input_shapes,
245 disable_shape_inference=disable_shape_inference,
246 net_input_format=net_input_format,
247 )
249 except Exception as e:
250 irrelevant_exception = isinstance(e, (MisspellNodeError, UnsupportedInputFormatError))

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:320, in Parser._parse_onnx_model_to_hn(self, onnx_model, net_name, start_node_names, end_node_names, net_input_shapes, disable_shape_inference, net_input_format, **kwargs)
317 except Exception as e:
318 self._logger.warning(f"ONNX shape inference failed: {e!s}")
--> 320 return self.parse_model_to_hn(
321 onnx_model,
322 None,
323 net_name,
324 start_node_names,
325 end_node_names,
326 nn_framework=NNFramework.ONNX,
327 output_shapes=output_shapes,
328 net_input_format=net_input_format,
329 **kwargs,
330 )

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:371, in Parser.parse_model_to_hn(self, model, values, net_name, start_node_names, end_node_names, nn_framework, output_shapes, net_input_format, rename_layers_by_blocks)
368 else:
369 raise BackendRuntimeException(f"Unsupported NN framework {nn_framework}")
--> 371 fuser = HailoNNFuser(converter.convert_model(), net_name, converter.end_node_names)
372 hailo_nn = fuser.convert_model()
373 hailo_nn.validate_stage(HnStage.HN)

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py:83, in HailoNNConverter.convert_model(self)
81 self._validate_model_params()
82 self._validate_bn_ops_in_training()
---> 83 self._create_layers()
84 self._add_layers_connections()
85 self._layers_graph.set_names_and_indices()

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py:38, in EdgeNNConverter._create_layers(self)
36 self._visited_states = {}
37 self._add_input_layers()
---> 38 self._update_vertices_info()
39 self._add_direct_layers()
40 self._validate_processed_vertices()

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py:320, in ONNXConverter._update_vertices_info(self)
318 found_non_const_padding = False
319 for node in self.graph.nodes_toposorted():
--> 320 node.update_output_format()
321 if node.op in POOL_OPS and node.is_global_pool():
322 node.is_spatial_1x1 = True

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py:544, in ONNXGraphNode.update_output_format(self)
541 out_format = [dim if dim != Dims.GROUPS else Dims.CHANNELS for dim in self.output_format]
542 self.output_format = out_format
--> 544 elif self.op == "MatMul" and self.is_matmul_layer():
545 self.output_format = self.get_matmul_layer_info()[-1]
547 elif self.op == "Reshape":

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py:4293, in ONNXGraphNode.is_matmul_layer(self)
4290 if self.op != "MatMul":
4291 return False
-> 4293 kernel, _ = self.get_kernel(is_conv2d=False)
4294 return (self.is_ew_op() and kernel is None) or len(kernel.shape) == 4

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py:868, in ONNXGraphNode.get_kernel(self, is_conv2d)
866 consumed_vertices.extend(pred.get_dense_reshape_vertices())
867 if pred.op in CONST_OPS:
--> 868 const_shape = pred._info.attribute[0].t.dims
869 if len(const_shape) > 1:
870 consumed_vertices.append(pred)

IndexError: list index (0) out of range

Please help us resolve this issue!

We would also be very grateful for any guidance on how to compile this model into .hef; maybe there are already successful cases or example code available.

Thank you!

Hey @roman.karpenko ,

Welcome to the Hailo Community!

From what I can see in your error log, the parser is crashing when it hits a MatMul operation. While fetching the MatMul's kernel (ONNXGraphNode.get_kernel), it reads attribute[0].t.dims from a constant predecessor that has no attributes at all, which is what raises the IndexError: list index (0) out of range. In other words, one of the constant nodes feeding a MatMul is malformed or not in the form the parser expects.
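
If you want to track down the offending node yourself, here is a minimal sketch using only the standard onnx Python package (the file name is a placeholder for your export) that prints the constants feeding each MatMul and flags any Constant node without attributes, which is the exact access that fails in your traceback:

    import onnx

    model = onnx.load("your_model.onnx")
    graph = model.graph

    # Shapes of initializers (weights stored in the graph rather than in Constant nodes)
    init_dims = {init.name: list(init.dims) for init in graph.initializer}

    # Map each Constant node's output name to the node itself
    const_nodes = {n.output[0]: n for n in graph.node if n.op_type == "Constant"}

    for node in graph.node:
        if node.op_type != "MatMul":
            continue
        for inp in node.input:
            if inp in init_dims:
                print(f"{node.name}: initializer '{inp}' dims={init_dims[inp]}")
            elif inp in const_nodes:
                const = const_nodes[inp]
                if not const.attribute:
                    # A Constant with no attributes is exactly what trips the Hailo parser:
                    # it reads attribute[0].t.dims and gets an IndexError
                    print(f"{node.name}: Constant '{const.name}' has no attributes!")
                else:
                    attr = const.attribute[0]
                    dims = list(attr.t.dims) if attr.type == onnx.AttributeProto.TENSOR else "(non-tensor)"
                    print(f"{node.name}: Constant '{const.name}' dims={dims}")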

I also noticed the SDK flagged your model as “large,” which can sometimes cause parsing issues.

Here’s what I’d suggest trying:

  1. Simplify your ONNX model using onnx-simplifier. This often cleans up problematic nodes (a re-run example with the simplified file follows this list):

    import onnx
    from onnxsim import simplify
    
    model = onnx.load("your_model.onnx")
    model_simp, check = simplify(model)
    assert check, "Simplified ONNX model could not be validated"
    onnx.save(model_simp, "your_model_simplified.onnx")
    
  2. Inspect your model with Netron and check those constant nodes feeding into MatMul operations - make sure they have all the required attributes.
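
Once the simplified model validates, you can feed it straight back into the same translate_onnx_model call from your post (reusing your chosen_hw_arch, onnx_model_name, image_h, and image_w; the file name below is just the output of step 1), for example:

    runner = ClientRunner(hw_arch=chosen_hw_arch)
    hn, npz = runner.translate_onnx_model(
        "your_model_simplified.onnx",
        onnx_model_name,
        start_node_names=["images", "orig_target_sizes"],
        net_input_shapes={"images": [1, 3, image_h, image_w], "orig_target_sizes": [1, 2]},
    )

This is only a sketch of the re-run; whether the parser gets further depends on what the simplifier did to those MatMul constants.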

Hope this helps!