Hey @sslee1,
The issue arises from a tensor shape mismatch during an ElementWise operation, specifically at the node `ew_mult1`, during compilation.
From your model graph, I can see that `ew_mult1` receives two inputs:
- One input has shape [-1, 1, 16384, 24]
- The other, coming from a resize operation, has shape [-1, 1, 1, 24]
This smaller tensor is likely being broadcasted implicitly in your original framework (TensorFlow or PyTorch), which is supported by those runtimes. However:
The DFC does not support automatic broadcasting in all scenarios, particularly when spatial dimensions differ. It requires explicit shape alignment between tensors in elementwise operations.
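To illustrate the difference, here is a minimal NumPy sketch (NumPy follows the same broadcasting rules as TensorFlow and PyTorch; the shapes below mirror your graph with the batch dimension omitted for brevity). The frameworks silently stretch the smaller tensor across the 16384 spatial positions, whereas the DFC expects the second operand to already carry the full shape:

```python
import numpy as np

a = np.ones((1, 16384, 24))             # main branch
b = np.arange(24.0).reshape(1, 1, 24)   # output of the resize branch

# Frameworks broadcast b implicitly across the 16384 positions...
implicit = a * b

# ...which is equivalent to explicitly tiling b first --
# the shape-aligned form the compiler requires:
b_tiled = np.tile(b, (1, 16384, 1))     # shape (1, 16384, 24)
explicit = a * b_tiled

assert b_tiled.shape == (1, 16384, 24)
assert np.array_equal(implicit, explicit)
```

Because the tiled version is numerically identical to the broadcast one, the fix does not change your model's outputs, only the declared shapes.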
To resolve this, you should explicitly tile or expand the [-1, 1, 1, 24] tensor to match [-1, 1, 16384, 24] before the `ew_mult1` operation.
Here’s how to fix it in your model before exporting to ONNX:
For TensorFlow:

```python
# If tensor shape is [B, 1, 1, 24]
x = tf.tile(x, [1, 1, 16384, 1])  # shape becomes [B, 1, 16384, 24]
```
For PyTorch:

```python
# If tensor shape is [B, 1, 1, 24]
x = x.expand(-1, -1, 16384, -1)   # view-based, no data copy
# or: x = x.repeat(1, 1, 16384, 1)  # materializes a copy
```
Make sure this reshape aligns with your model’s logic (e.g. applying a per-channel scale across spatial positions).
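As a quick sanity check before re-exporting (a sketch, assuming the smaller tensor acts as a per-channel scale as described above), you can verify that the expanded tensor reproduces the implicitly broadcast result exactly:

```python
import torch

a = torch.randn(2, 1, 16384, 24)   # main branch, B = 2
scale = torch.randn(2, 1, 1, 24)   # per-channel scale from the resize branch

expanded = scale.expand(-1, -1, 16384, -1)  # [2, 1, 16384, 24]

# The explicit version must match the implicitly broadcast one
assert expanded.shape == (2, 1, 16384, 24)
assert torch.equal(a * scale, a * expanded)
```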
After making this change, re-export the ONNX model and re-parse it with the DFC.
This should allow the compiler to proceed without triggering shape mismatch errors.