Reshape layer error

According to the Dataflow Compiler User Guide (p. 138), reshape supports transforming a tensor from (batch, height, 1, F) to (batch, height, W′, F′) where F = W′ * F′. But it seems there is something more going on beyond what is covered in the guide.
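To sanity-check the documented rule, here is a small NumPy sketch (NumPy only stands in for the shape arithmetic here; the actual constraint applies to the compiled Hailo layers):

```python
import numpy as np

# Documented pattern: (batch, height, 1, F) -> (batch, height, W', F')
# where F = W' * F'. With F = 16384, W' = 4, F' = 4096:
x = np.zeros((1, 512, 1, 16384), dtype=np.float32)
y = x.reshape(1, 512, 4, 4096)  # valid since 16384 == 4 * 4096

assert y.shape == (1, 512, 4, 4096)
assert y.size == x.size  # element count is preserved
```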

As I have posted before, I am still working through the area-attention module, and I suspect there is an issue with the reshape layer.

code

def forward(self, x):
    """Processes the input tensor 'x' through the area-attention."""
    B, C, H, W = x.shape
    N = H * W

    qk = self.qk(x).flatten(2)
    v = self.v(x)
    pp = self.pe(v)
    v = v.flatten(2)

    if self.area > 1:
        qk = qk.unsqueeze(2)
        print(qk.shape)
        qk = qk.reshape(1, C * 2, B * self.area, N // self.area)
        print(qk.shape)
        qk = qk.transpose(0, 2)
        v = v.unsqueeze(2)
        v = v.reshape(1, C, B * self.area, N // self.area)
        v = v.transpose(0, 2)
        B, _, _, N = qk.shape

The printed shapes before and after the reshape:

torch.Size([1, 512, 1, 16384])
torch.Size([1, 512, 4, 4096])

But during parsing, the result becomes None, which causes TypeError: object of type 'NoneType' has no len()

input [batch, channels, height, width]
/qk/conv/Conv [batch, channels, height, width]
/Reshape [1, 512, 16384] [batch, channels, height, width]
/Reshape [batch, channels, width]
/Unsqueeze [batch, channels, groups, width]
/Reshape_2 None

Is there anything else not documented in the guide? Any help is appreciated.

It seems every question related to the reshape layer goes unanswered. Is asking about reshape layer issues against the rules of the community?
If so, I will stop asking questions related to this layer. I hope I can at least be told whether reshape is currently unsupported.

Hey @HappySniper95,

Thanks for your patience. Could you provide a Netron image of the ONNX or something similar so I can better diagnose your issue?

The problem you’re encountering relates to reshape operations in Hailo’s compiler. While basic reshaping from (batch, height, 1, F) to (batch, height, W′, F′) works when F = W′ * F′, there’s an important limitation:

Hailo’s compiler requires all reshape operations to have fully statically defined output shapes at compile-time. If any dimension is:

  • Derived dynamically from a prior operation
  • Inferred from runtime variables (like B * area or N // area)
  • Ambiguous due to transpositions or view chains

Then ONNX shape inference fails, and the Hailo parser treats it as None, leading to the error:

TypeError: object of type 'NoneType' has no len()

Looking at your code:

qk = self.qk(x).flatten(2)  # → [B, C, H*W]
qk = qk.unsqueeze(2)
qk = qk.reshape(1, C * 2, B * self.area, N // self.area)

This works in PyTorch but not in ONNX unless B, area, and N are constants during export. After ONNX export, the shape at /Reshape_2 becomes None.

The key difference is that PyTorch traces runtime shapes during export, but Hailo’s compiler parses ONNX statically. If any dimension is undefined, it cannot resolve reshape chains.

Try avoiding implicit reshapes with runtime-dependent expressions. Instead of:

qk = qk.reshape(1, C * 2, B * self.area, N // self.area)

Use hardcoded constants for test inputs:

qk = qk.reshape(1, 512, 4, 4096)

To help you further with your specific case, please share a Netron visualization or model name.

@omira
Thank you for your response. I sincerely appreciate your help.
I have tried using hardcoded constants for the reshape layer, but it still outputs None.
code:

qk = qk.reshape(1, 512, 4, 4096)
v = v.reshape(1, 256, 4, 4096)

result:

input [batch, channels, height, width]
/qk/conv/Conv [batch, channels, height, width]
/Reshape_2 None
/Reshape_2 None

Here is the ONNX visualization:

Are there any constraints that prevent certain layers (e.g. the flatten layer) from being used before the reshape layer?

Hey @HappySniper95,

Thanks for sharing the graph; these are my thoughts:

Root Cause

Hailo’s Dataflow Compiler does not support shape propagation through runtime shape-calculation logic, even if the resulting values are constant at inference time.

So even this setup:

Shape → Slice → Concat → Reshape

fails Hailo’s requirement of a fully static Reshape shape input.

Recommended Fix

Hardcode the shape using a Constant node upstream:

  • Make sure the shape input of the Reshape comes directly from a constant initializer, not any ops like Shape, Slice, etc.

Also verify in Netron that the reshape node’s second input is a solid constant and not part of a calculation chain.

Thanks. I will try it out.