IndexError: List Index Out of Range During ONNX to HEF Conversion for Classification Models

We are encountering a persistent issue while converting our classification models from ONNX to HEF using the Hailo Model Zoo (hailomz). After fine-tuning the models (MobileNet_v2_1.0_224 and ResNet V2 50) from existing checkpoints on our custom TFRecord datasets, the ONNX export succeeded without issues. However, when we tried to convert the ONNX models to HEF, the process failed with the following error:

IndexError: list index (0) out of range
File "onnx_graph.py", line 625, in parse_raw_data
    parsed_data = numpy_helper.to_array(self._info.attribute[0].t)

We tried removing the batch dimension from the ONNX model, but the issue persisted. The error seems to indicate that the compiler is expecting an attribute that does not exist or is incorrectly referenced within the ONNX model. Without further documentation, we are unsure what specific attributes or indices are required for the conversion to succeed.
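For anyone trying to reproduce this, a check along the following lines with the onnx Python package (the file name is a placeholder) prints the declared input shapes and flags any node whose first attribute carries no tensor payload, which appears to be what parse_raw_data assumes exists:

import onnx
from onnx import numpy_helper

model = onnx.load("model.onnx")  # placeholder path; adjust as needed

# Declared input shapes, including any batch dimension.
for inp in model.graph.input:
    dims = [d.dim_param or d.dim_value for d in inp.type.tensor_type.shape.dim]
    print(f"input {inp.name}: {dims}")

# Flag nodes that have no attributes, or whose first attribute
# is not a tensor (attribute[0].t would be empty in that case).
for node in model.graph.node:
    if not node.attribute:
        print(f"{node.op_type} '{node.name}': no attributes")
    elif node.attribute[0].type != onnx.AttributeProto.TENSOR:
        print(f"{node.op_type} '{node.name}': first attribute "
              f"'{node.attribute[0].name}' is not a tensor")
    else:
        tensor = numpy_helper.to_array(node.attribute[0].t)
        print(f"{node.op_type} '{node.name}': tensor attribute shape {tensor.shape}")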

Could you clarify what specific attribute or index the converter expects in the ONNX model?
Are there any examples or templates available that demonstrate how to prepare ONNX models for HEF conversion, especially for classification models?

Hi @junhui,

It seems the error message is unclear in this case; I'm raising it with our parser developers so it can be improved. Thanks for bringing it to our attention.

Are you using start/end node names for the parsing? If so, what are they?

Hi @nina-vilela, thank you for your reply.

Yes, we specified the start and end node names according to our ONNX model's nodes. The input and output node names of my ONNX model are input and output, respectively. This is the command I used for model compilation:

hailomz compile resnet_v1_50 --ckpt model.onnx --hw-arch hailo8l --start-node-names input --end-node-names output
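As a sanity check on those names, they can be read straight out of the model with the onnx Python package, along these lines (the file name is a placeholder):

import onnx

model = onnx.load("model.onnx")  # placeholder file name

# Some exporters list weights among the graph inputs, so filter
# the initializers out and keep only the real network inputs.
weights = {init.name for init in model.graph.initializer}
inputs = [i.name for i in model.graph.input if i.name not in weights]
outputs = [o.name for o in model.graph.output]

print("inputs:", inputs)    # expect ['input']
print("outputs:", outputs)  # expect ['output']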

To make it easier to troubleshoot, I will share the ONNX model file with you privately through Google Drive. Kindly let me know if you need more information.

Thanks!