TypeError: 'NoneType' object is not subscriptable when translating a baudm/parseq onnx file to HEF

Hi,

I’m trying to convert an ONNX file that was exported from the baudm/parseq project (Scene Text Recognition with Permuted Autoregressive Sequence Models, ECCV 2022) and then simplified, but I get this error:
TypeError: 'NoneType' object is not subscriptable

This is a part of the stack trace:

  File "/home/hailo/.local/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py", line 385, in _layer_callback_from_vertex
    if vertex.is_null_operation() and not is_flattened_global_maxpool:
  File "/home/hailo/.local/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py", line 5307, in is_null_operation
    or (self.op == "ReduceMean" and self.is_null_reduce_mean())
  File "/home/hailo/.local/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py", line 5342, in is_null_reduce_mean
    axes = self._convert_axes_to_nhwc(axes_info)
  File "/home/hailo/.local/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py", line 2250, in _convert_axes_to_nhwc
    return [nchw_to_nhwc_axis_mapping[self.input_format[axis]] for axis in axes]
  File "/home/hailo/.local/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_graph.py", line 2250, in <listcomp>
    return [nchw_to_nhwc_axis_mapping[self.input_format[axis]] for axis in axes]
TypeError: 'NoneType' object is not subscriptable

My code

from hailo_sdk_client import ClientRunner

chosen_hw_arch = "hailo8l"
onnx_model_name = "parseq_tiny_fp32_simple"
onnx_path = f"/onnx/{onnx_model_name}.onnx"

runner = ClientRunner(hw_arch=chosen_hw_arch)

hn, npz = runner.translate_onnx_model(
    onnx_path,
    onnx_model_name,
    start_node_names=["images"],
    end_node_names=["5293"],
    disable_shape_inference=True,
)

hailo_model_har_name = f"/onnx/{onnx_model_name}_hailo_model.har"
runner.save_har(hailo_model_har_name)

I have traced the error back to a ReduceMean node in the graph.

I am running it inside a docker container with these packages:

  • python3.10
  • hailo_dataflow_compiler-3.31.0-py3-none-linux_x86_64.whl
  • hailort-4.21.0-cp310-cp310-linux_x86_64.whl
  • hailort_4.21.0_amd64.deb

Any suggestions on how to solve this?
Thank you in advance.

Hey @maaaw ,

Welcome to the Hailo Community!

What’s Going Wrong

You’re seeing:

TypeError: 'NoneType' object is not subscriptable

This happens while the translator examines a ReduceMean operation. The SDK expects self.input_format to describe the tensor layout, but it is None, so it crashes when trying to map the reduction axes from NCHW to NHWC.
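Stripped of the SDK context, the failing line boils down to indexing None, which is exactly the error Python reports:

```python
# Minimal reproduction of the failure mode: self.input_format ends up as
# None when layout inference fails, and indexing None raises this TypeError.
input_format = None

try:
    input_format[1]  # mirrors nchw_to_nhwc_axis_mapping[self.input_format[axis]]
except TypeError as err:
    print(err)  # 'NoneType' object is not subscriptable
```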


Why This Happens

  • Your model has a ReduceMean node reducing a shape like [1, 26, 192] → [1, 26, 1].
  • The SDK tries to evaluate if this is a “null-op” by checking the axes.
  • If shape inference is off or your model is over-simplified, input_format never gets set, and the translation crashes.

How to Fix It

1. Enable shape inference

If you’re using something like this:

disable_shape_inference=True

Change it to:

disable_shape_inference=False

This should let the SDK set the input_format properly.

2. Check your model in Netron

Open it at netron.app and check:

  • That ReduceMean nodes do have the axes attribute.
  • If that’s missing, the SDK can’t parse it properly.

Hi,

I am seeing this exact error, but none of the suggested fixes work for me.

In my case, I have two ReduceMean operators following each other. Both of them have the 'axes' attribute set (1 and -2 respectively), and I have set the flag disable_shape_inference=False. I get the error regardless of any simplification of the net.

Do you have any other suggestions of what to do?

Hey @Magnus_Grandin,

Your crash is happening because the translator loses track of the tensor layout when you chain two ReduceMean ops with axes 1 and -2.

Here’s what I’d try:

First option - normalize your axes and use keepdims:

  • Convert that -2 to its positive equivalent
  • Set keepdims=1 on both ReduceMean nodes to keep the tensor dimensions consistent
  • Throw in a Squeeze at the end if you need to drop those reduced dims
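The normalization step is plain Python: a negative ONNX axis is taken modulo the tensor rank, so -2 on a rank-3 tensor becomes 1. A minimal helper (the function name is mine):

```python
def normalize_axes(axes, rank):
    """Map negative ONNX axes to their positive equivalents for a tensor
    of the given rank; e.g. -2 with rank 3 normalizes to 1."""
    return sorted(axis % rank for axis in axes)

print(normalize_axes([-2], 3))  # [1]
```

You would then write the normalized values back into the node’s axes attribute before re-exporting.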

Better approach - just fuse them: Since you’re doing back-to-back reductions anyway, why not just combine the axes into one ReduceMean? Way cleaner and sidesteps the whole layout issue.
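The fusion is numerically exact when the axes are disjoint: with keepdims set, chaining two means equals a single mean over the union of the axes. A quick NumPy sanity check (the shape mirrors the [1, 26, 192] mentioned earlier; the axes here are illustrative):

```python
import numpy as np

x = np.random.default_rng(0).random((1, 26, 192)).astype(np.float32)

# Two chained reductions with keepdims...
chained = x.mean(axis=1, keepdims=True).mean(axis=2, keepdims=True)
# ...equal one reduction over the union of the axes:
fused = x.mean(axis=(1, 2), keepdims=True)

assert np.allclose(chained, fused)
```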

Also worth checking:

  • Make sure your axes are node attributes (int64s) instead of input tensors - the translator handles those way better
  • Try sticking with positive NHWC indices when you can (like {1,2} for spatial dims)
  • If all else fails, you could do Transpose → ReduceMean → Transpose but that’s probably overkill

The fused approach usually does the trick.
Let me know how it goes!