ClientRunner translate_onnx_model() IndexError: list index out of range

I am trying to translate an ONNX model like so:

runner = ClientRunner(hw_arch=chosen_hw_arch)

hn, npz = runner.translate_onnx_model(
    onnx_path,
    onnx_model_name,
    start_node_names=["input"],
    end_node_names=["output"],
    net_input_shapes={"input": [1, 16000]},
    disable_rt_metadata_extraction=True,
)

This generates the following messages:

[info] Translation started on ONNX model non_percussive_mymodel_768_8_augmentation_model
[info] Restored ONNX model non_percussive_mymodel_768_8_augmentation_model (completion time: 00:00:00.06)
[warning] This model has non-default (reflective/edge) padding layers which are not supported currently, and were replaced with zero padding. When the padding precedes pooling layers, we expect slight degradation in the parsed model. For more info about padding modes, please refer to https://github.com/onnx/onnx/blob/main/docs/Changelog.md#pad-11
[info] Simplified ONNX model for a parsing retry attempt (completion time: 00:00:01.21)
[warning] This model has non-default (reflective/edge) padding layers which are not supported currently, and were replaced with zero padding. When the padding precedes pooling layers, we expect slight degradation in the parsed model. For more info about padding modes, please refer to https://github.com/onnx/onnx/blob/main/docs/Changelog.md#pad-11

Then, it fails with:

---------------------------------------------------------------------------
IndexError                                Traceback (most recent call last)
File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:235, in Parser.translate_onnx_model(self, model, net_name, start_node_names, end_node_names, net_input_shapes, augmented_path, disable_shape_inference, disable_rt_metadata_extraction, net_input_format, **kwargs)
    234 try:
--> 235     parsing_results = self._parse_onnx_model_to_hn(
    236         onnx_model=onnx_model,
    237         net_name=valid_net_name,
    238         start_node_names=start_node_names,
    239         end_node_names=end_node_names,
    240         net_input_shapes=net_input_shapes,
    241         disable_shape_inference=disable_shape_inference,
    242         net_input_format=net_input_format,
    243     )
    245 except Exception as e:

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:316, in Parser._parse_onnx_model_to_hn(self, onnx_model, net_name, start_node_names, end_node_names, net_input_shapes, disable_shape_inference, net_input_format, **kwargs)
    314         self._logger.warning(f"ONNX shape inference failed: {e!s}")
--> 316 return self.parse_model_to_hn(
    317     onnx_model,
    318     None,
    319     net_name,
    320     start_node_names,
    321     end_node_names,
    322     nn_framework=NNFramework.ONNX,
    323     output_shapes=output_shapes,
    324     net_input_format=net_input_format,
    325     **kwargs,
    326 )

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:367, in Parser.parse_model_to_hn(self, model, values, net_name, start_node_names, end_node_names, nn_framework, output_shapes, net_input_format, rename_layers_by_blocks)
    365     raise BackendRuntimeException(f"Unsupported NN framework {nn_framework}")
--> 367 fuser = HailoNNFuser(converter.convert_model(), net_name, converter.end_node_names)
    368 hailo_nn = fuser.convert_model()

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py:83, in HailoNNConverter.convert_model(self)
     82 self._validate_bn_ops_in_training()
---> 83 self._create_layers()
     84 self._add_layers_connections()

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py:40, in EdgeNNConverter._create_layers(self)
     39 self._update_vertices_info()
---> 40 self._add_direct_layers()
     41 self._validate_processed_vertices()

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py:122, in EdgeNNConverter._add_direct_layers(self)
    121 self._logger.debug(f"Processing vertex {vertex.name}")
--> 122 self._layer_callback_from_vertex(vertex)
    123 self._visited_states[vertex] = VertexState.PROCESSED

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py:282, in ONNXConverter._layer_callback_from_vertex(self, vertex)
    280     raise UnsupportedOperationError(msg)
--> 282 if vertex.is_null_operation() and not is_flattened_global_maxpool:
    283     consumed_vertices = self._create_null_layer(vertex)

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py:4986, in ONNXGraphNode.is_null_operation(self)
   4984             return True
-> 4986 if self.op in PAD_OPS and self.is_null_padding():
   4987     return True

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py:1127, in ONNXGraphNode.is_null_padding(self)
   1126     return False
-> 1127 _, pads, _, _ = self.get_vertex_padding()
   1128 return pads == [0, 0, 0, 0, 0, 0]

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py:1096, in ONNXGraphNode.get_vertex_padding(self)
   1095     # known issue following deprecation in onnx proto (https://github.com/NVIDIA/TensorRT/issues/195)
-> 1096     pads = [pads[1], pads[2], pads[3], pads[5], pads[6], pads[7]]
   1098 # get padding type from auto_pad attr, or infer from paddings vals

IndexError: list index out of range

During handling of the above exception, another exception occurred:

IndexError                                Traceback (most recent call last)
Cell In [19], line 3
      1 runner = ClientRunner(hw_arch=chosen_hw_arch)
----> 3 hn, npz = runner.translate_onnx_model(
      4     onnx_path,
      5     onnx_model_name,
      6     start_node_names=["input"],
      7     end_node_names=["output"],
      8     net_input_shapes={"input": [1, 16000]},
      9     disable_rt_metadata_extraction=True,
     10 )

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_common/states/states.py:16, in allowed_states.<locals>.wrap.<locals>.wrapped_func(self, *args, **kwargs)
     12 if self._state not in states:
     13     raise InvalidStateException(
     14         f"The execution of {func.__name__} is not available under the state: {self._state.value}",
     15     )
---> 16 return func(self, *args, **kwargs)

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py:1158, in ClientRunner.translate_onnx_model(self, model, net_name, start_node_names, end_node_names, net_input_shapes, augmented_path, disable_shape_inference, disable_rt_metadata_extraction, net_input_format, **kwargs)
   1115 """
   1116 DFC API for parsing an ONNX model. This creates a runner with loaded HN (model) and
   1117 parameters.
   (...)
   1155 
   1156 """
   1157 parser = Parser()
-> 1158 parser.translate_onnx_model(
   1159     model=model,
   1160     net_name=net_name,
   1161     start_node_names=start_node_names,
   1162     end_node_names=end_node_names,
   1163     net_input_shapes=net_input_shapes,
   1164     augmented_path=augmented_path,
   1165     disable_shape_inference=disable_shape_inference,
   1166     disable_rt_metadata_extraction=disable_rt_metadata_extraction,
   1167     net_input_format=net_input_format,
   1168     **kwargs,
   1169 )
   1171 return self._finalize_parsing(parser.return_data)

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:276, in Parser.translate_onnx_model(self, model, net_name, start_node_names, end_node_names, net_input_shapes, augmented_path, disable_shape_inference, disable_rt_metadata_extraction, net_input_format, **kwargs)
    273     milestone = self._format_time_milestone(start_time)
    274     self._logger.info(f"Simplified ONNX model for a parsing retry attempt (completion time: {milestone})")
--> 276     parsing_results = self._parse_onnx_model_to_hn(
    277         onnx_model=simplified_model,
    278         net_name=valid_net_name,
    279         start_node_names=start_node_names,
    280         end_node_names=end_node_names,
    281         net_input_shapes=net_input_shapes,
    282         disable_shape_inference=disable_shape_inference,
    283         net_input_format=net_input_format,
    284         **kwargs,
    285     )
    287 milestone = self._format_time_milestone(start_time)
    288 self._logger.info(f"Translation completed on ONNX model {valid_net_name} (completion time: {milestone})")

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:316, in Parser._parse_onnx_model_to_hn(self, onnx_model, net_name, start_node_names, end_node_names, net_input_shapes, disable_shape_inference, net_input_format, **kwargs)
    313     except Exception as e:
    314         self._logger.warning(f"ONNX shape inference failed: {e!s}")
--> 316 return self.parse_model_to_hn(
    317     onnx_model,
    318     None,
    319     net_name,
    320     start_node_names,
    321     end_node_names,
    322     nn_framework=NNFramework.ONNX,
    323     output_shapes=output_shapes,
    324     net_input_format=net_input_format,
    325     **kwargs,
    326 )

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py:367, in Parser.parse_model_to_hn(self, model, values, net_name, start_node_names, end_node_names, nn_framework, output_shapes, net_input_format, rename_layers_by_blocks)
    364 else:
    365     raise BackendRuntimeException(f"Unsupported NN framework {nn_framework}")
--> 367 fuser = HailoNNFuser(converter.convert_model(), net_name, converter.end_node_names)
    368 hailo_nn = fuser.convert_model()
    369 hailo_nn.validate_stage(HnStage.HN)

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py:83, in HailoNNConverter.convert_model(self)
     81 self._validate_model_params()
     82 self._validate_bn_ops_in_training()
---> 83 self._create_layers()
     84 self._add_layers_connections()
     85 self._layers_graph.set_names_and_indices()

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py:40, in EdgeNNConverter._create_layers(self)
     38 self._add_input_layers()
     39 self._update_vertices_info()
---> 40 self._add_direct_layers()
     41 self._validate_processed_vertices()

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py:122, in EdgeNNConverter._add_direct_layers(self)
    120 elif vertex not in self._visited_states:
    121     self._logger.debug(f"Processing vertex {vertex.name}")
--> 122     self._layer_callback_from_vertex(vertex)
    123     self._visited_states[vertex] = VertexState.PROCESSED
    125 for node in sorted(self._graph.successors(vertex), key=attrgetter("name")):

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py:282, in ONNXConverter._layer_callback_from_vertex(self, vertex)
    279     msg = f"{vertex.op} operation is unsupported"
    280     raise UnsupportedOperationError(msg)
--> 282 if vertex.is_null_operation() and not is_flattened_global_maxpool:
    283     consumed_vertices = self._create_null_layer(vertex)
    284 elif vertex.op in CONV2D_OPS:

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py:4986, in ONNXGraphNode.is_null_operation(self)
   4983         if len(preds) == 1 and preds[0].is_non_negative_result():
   4984             return True
-> 4986 if self.op in PAD_OPS and self.is_null_padding():
   4987     return True
   4989 if self.op in ["Reshape", "Flatten"] and self.is_null_flatten_reshape():

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py:1127, in ONNXGraphNode.is_null_padding(self)
   1125 if self.op not in PAD_OPS:
   1126     return False
-> 1127 _, pads, _, _ = self.get_vertex_padding()
   1128 return pads == [0, 0, 0, 0, 0, 0]

File /local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py:1096, in ONNXGraphNode.get_vertex_padding(self)
   1093             raise UnsupportedPaddingError(f"Could not extract padding values for vertex {self.name}.")
   1095     # known issue following deprecation in onnx proto (https://github.com/NVIDIA/TensorRT/issues/195)
-> 1096     pads = [pads[1], pads[2], pads[3], pads[5], pads[6], pads[7]]
   1098 # get padding type from auto_pad attr, or infer from paddings vals
   1099 if auto_pad_attr:

IndexError: list index out of range

All of the Pad layers in the ONNX model use constant_value = 0.

How do I solve this?

Thanks in advance!

Hey @vasco.f.santos,

Welcome to the Hailo Community!

The IndexError: list index out of range you’re encountering is raised inside the get_vertex_padding function. This typically happens when the ONNX model’s padding configuration doesn’t match what the Hailo Dataflow Compiler expects.

In your case, the parser reads up to pads[7], i.e. it assumes each Pad node carries eight pad values (begin/end pairs for a 4D tensor). With a [1, 16000] input, your Pad nodes most likely operate on lower-rank tensors and carry fewer values, so the index runs out of range while the compiler checks whether the padding can be treated as zero padding.
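You can confirm this by printing the pad values each Pad node actually carries. A minimal inspection sketch with the onnx Python package (the your_model.onnx path is a placeholder for your file):

import onnx
from onnx import numpy_helper

model = onnx.load("your_model.onnx")
initializers = {init.name: init for init in model.graph.initializer}

for node in model.graph.node:
    if node.op_type != "Pad":
        continue
    # opset < 11: pads are stored as a node attribute
    for attr in node.attribute:
        if attr.name == "pads":
            print(node.name, "attribute pads:", list(attr.ints))
    # opset >= 11: pads come from the second input, usually an initializer
    if len(node.input) > 1 and node.input[1] in initializers:
        pads = numpy_helper.to_array(initializers[node.input[1]])
        print(node.name, "input pads:", pads.tolist())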

Solution

1. Normalize Padding in ONNX Model

First, let’s modify your ONNX model to use standard zero padding:

import onnx

# Load the model
model = onnx.load("your_model.onnx")

# Standardize padding nodes (in opset < 11 the pads are a node attribute)
for node in model.graph.node:
    if node.op_type == "Pad":
        # Convert to zero padding; keep the original length (2 * tensor rank)
        for attr in node.attribute:
            if attr.name == "pads":
                attr.ints[:] = [0] * len(attr.ints)

onnx.save(model, "updated_model.onnx")
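If your model was exported with opset 11 or later, the Pad operator takes its pad values from a second input (usually an initializer) rather than from an attribute, so the loop above won’t find them. A sketch of the equivalent edit under that assumption:

import onnx
import numpy as np
from onnx import numpy_helper

model = onnx.load("your_model.onnx")
initializers = {init.name: init for init in model.graph.initializer}

for node in model.graph.node:
    if node.op_type == "Pad" and len(node.input) > 1 and node.input[1] in initializers:
        pads_tensor = initializers[node.input[1]]
        pads = numpy_helper.to_array(pads_tensor)
        # Replace the pads initializer with zeros of the same shape, dtype and name
        pads_tensor.CopyFrom(numpy_helper.from_array(np.zeros_like(pads), name=pads_tensor.name))

onnx.save(model, "updated_model.onnx")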

2. Update Translation Configuration

When translating the updated model, try these settings (the model path and network name are passed positionally, as in your original call):

hn, npz = runner.translate_onnx_model(
    "updated_model.onnx",
    "your_model",
    start_node_names=["input"],
    end_node_names=["output"],
    net_input_shapes={"input": [1, 16000]},
    disable_shape_inference=True,
    disable_rt_metadata_extraction=True,
)

3. Model Simplification

If you’re still seeing issues, simplify your model:

python3 -m onnxsim your_model.onnx simplified_model.onnx
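If you prefer to do this from Python, the onnxsim package also exposes a simplify() helper; a short sketch:

import onnx
from onnxsim import simplify

model = onnx.load("your_model.onnx")
model_simplified, ok = simplify(model)
assert ok, "simplified model failed the validation check"
onnx.save(model_simplified, "simplified_model.onnx")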

Let me know if you need any clarification or run into other issues!

Best Regards,
Omria