update_reshape_output_format returns None

I’m trying to convert an ONNX model of PaddleOCR. During the parsing phase, I got the error messages below:

Traceback (most recent call last):
  File "/opt/hailo8-compiler/bin/test.py", line 13, in <module>
    hn, npz = runner.translate_onnx_model(
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 1177, in translate_onnx_model
    parser.translate_onnx_model(
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 280, in translate_onnx_model
    parsing_results = self._parse_onnx_model_to_hn(
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 320, in _parse_onnx_model_to_hn
    return self.parse_model_to_hn(
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 371, in parse_model_to_hn
    fuser = HailoNNFuser(converter.convert_model(), net_name, converter.end_node_names)
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 83, in convert_model
    self._create_layers()
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_client/model_translator/edge_nn_translator.py", line 38, in _create_layers
    self._update_vertices_info()
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py", line 316, in _update_vertices_info
    node.update_output_format()
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py", line 506, in update_output_format
    self.update_reshape_output_format(input_format)
  File "/opt/hailo8-compiler/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py", line 347, in update_reshape_output_format
    elif len(output_shapes) == len(input_format) == 4:
TypeError: object of type 'NoneType' has no len()

I added some debug code to the update_reshape_output_format method and got the following output:

update_reshap_output_format self.name: p2o.Reshape.67
update_reshap_output_format input_format: [batch, width, channels]
update_reshap_output_format self.output_format: None

It looks like the update_reshape_output_format method of ONNXGraphNode does not know how to update the output_format of this Reshape node.

Can anyone guide me or advise me on how to solve this problem?

Hey @Frank_Wu ,

Based on the traceback and debug output you shared:

TypeError: object of type 'NoneType' has no len()

This is happening in the update_reshape_output_format() method during ONNX parsing. Your debug logs show:

self.name = "p2o.Reshape.67"
input_format = [batch, width, channels]
self.output_format = None

After looking at your Netron visualization, I can see the issue clearly. The Reshape node in your model:

  • Receives inputs from a Concat(axis=0) and an Add(B=360)
  • Feeds into a Transpose operation
  • Is missing the second input tensor that defines the target shape

This is a common issue we see with models converted from the Paddle framework, where the Reshape operations rely on dynamic shape inference that our DFC can’t resolve.

Here are two solutions that have worked for other customers:

Solution 1: Use ONNX Simplifier (Recommended)

First, try using onnxsim to simplify and add static shapes to your model:

pip install onnxsim
python3 -m onnxsim model.onnx model_simplified.onnx

This tool will attempt to infer the missing shapes and inject them as constant tensors, which our compiler can then process correctly.

Solution 2: Manual ONNX Patching

If simplification doesn’t work, you’ll need to patch your ONNX file to add the missing shape input to the problematic Reshape node.
You may need to adjust the target shape dimensions based on your specific model architecture.

Why This Happens

This is mentioned in our DFC User Guide:

“Some ONNX models, especially from Paddle or PyTorch, may contain layers that rely on dynamic shape propagation. It is highly recommended to preprocess and simplify the model to inject static shapes where possible.”

PaddleOCR models often use this dynamic reshaping pattern, which doesn’t translate cleanly to ONNX without additional processing.

Best Practices

From our experience with customer models:

  1. Always run onnxsim on Paddle-exported ONNX models before using with Hailo DFC
  2. When working with OCR models, pay special attention to Reshape operations

Let me know if this resolves your issue, or if you need further assistance with the model conversion process!