Hailo parser loses spatial metadata, ignores reshape nodes, and breaks the graph

I am trying to parse part of a Siamese object-tracking model from ONNX to HAR. Very early in the graph, after the reshapes, the parser "forgets" the spatial metadata of the data tensors. As a result, it silently ignores reshape operations from the ONNX graph, applies spatial operations incorrectly, and so on. Please find attached the relevant parts of the ONNX and HAR graphs. One can see that in the HAR graph the reshape node after the concatenation is ignored, and the subsequent external_pad node before the depthwise convolution pads the wrong dimensions: the tensor becomes 3x322x256 and produces nonsense 3x320 data afterwards, instead of being split and then padded to 1x320x18x18.

I tried summing the tensor with a zero matrix and performing an identity convolution after the concatenation node to force the parser to infer the intended shape. However, in every case the model failed to parse because of a shape mismatch: the parser omits the reshape node and then breaks on the invalid shapes it gets as a result.
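For reference, the identity-convolution workaround I tried looks roughly like this (a minimal sketch; the module name and channel count here are illustrative, not from the actual model):

```python
import torch
import torch.nn as nn

class IdentityConv(nn.Module):
    """1x1 convolution initialized to the identity, inserted after the
    concatenation to nudge the parser into re-inferring the tensor shape."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=1, bias=False)
        with torch.no_grad():
            # weight[i, j, 0, 0] = 1 if i == j, else 0 -> output == input
            self.conv.weight.copy_(
                torch.eye(channels).view(channels, channels, 1, 1)
            )

    def forward(self, x):
        return self.conv(x)

# Numerically a no-op, but it adds a Conv node to the exported ONNX graph:
x = torch.randn(1, 320, 18, 18)
y = IdentityConv(320)(x)
assert torch.allclose(x, y, atol=1e-5)
```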

The same problem appears practically anywhere in the graph once the input data has been reshaped. As the attached screenshot shows, at the beginning of the graph the parser correctly identifies the same depthwise convolution and parses it properly.

The third attached image also shows that after the first reshape (which was parsed as "format_conversion"), the data tensor gets an extra leading 1x dimension, while HAR graphs usually do not have one. I interpret this as another sign of the parser losing the spatial metadata.

What can be done to resolve this and parse the model correctly? So far I feel I have tried everything I could on the PyTorch/ONNX side: nothing I changed in the model source code affected this, and the problem persists.

Thank you for your help!

Hi Yakiv,

Can you please share more information so we can better analyze and assist with the issues you are facing?

  • The ONNX model file, or a partial export in ONNX format in case you can't share the entire model.
  • The HAR file you obtained from parsing.
  • The PyTorch parameters you used for exporting the model: torch version, ONNX opset version, dynamo True/False, etc.
  • The Hailo parser parameters you used in your attempts: DFC version, start/end node names, etc.

Some other things to try, in case you haven't already:

  • Run onnxsim or onnxslim on the model before trying to parse it.
  • Export from PyTorch with dynamo=False, in case you used dynamo=True.

Hello!
Sorry for the long wait.
The .onnx and .har files you requested are here:
https://drive.proton.me/urls/4THENG2JN8#BaXO0una0jdz

Speaking of export parameters: we are stuck on Python 3.7 for this pipeline, so the dynamo=False argument is not available to me. I call torch.onnx.export with opset_version=11 and do_constant_folding=True.

On the Hailo parser side, I pass --hw-arch hailo8l and nothing else; start and end nodes are left at their defaults.

Thank you for your time and assistance!