Hello,
I am trying to run an RTMDet segmentation ONNX model on a Hailo-8L.
Parsing works after some ONNX fixes, but `hailo optimize` fails during statistics collection with a Keras shape error.
Without a successful optimize/quantization pass, `hailo compiler` refuses to generate a HEF because there are no quantized weights.
## Environment
- Hailo DFC Version: 3.33.0
- HailoRT Version: 4.23.0
The starting point is an RTMDet-Seg ONNX model with input:
- `input`: `[1, 3, 736, 448]` (NCHW)
Initially, when running the parser, I got:
> `TypeError: The element type in the input tensor is not defined.`
> in the path: `onnx_graph.get_tile_repeats() -> numpy_helper.to_array`
The reason was dynamic `Tile` nodes (`/Tile_1`, `/Tile_2`) whose `repeats` were computed at runtime via:
- `Shape -> ConstantOfShape -> Expand -> Concat -> Tile`
Using ONNX Runtime, I evaluated the actual `repeats` values once:
- `/Concat_17_output_0` → `int64[4] = [1, 1, 1, 1]`
- `/Concat_19_output_0` → `int64[5] = [1, 19, 1, 1, 1]`
Then I modified the ONNX so that `/Tile_1` and `/Tile_2` take static int64 initializers as their `repeats` inputs.
With this change, the modified ONNX parses successfully.
So I now have a valid HAR (`model.har`), and the recommended end nodes are: `/Sigmoid`, `/Tile_2`, `/Concat_7`, `/Concat_5`.
When I run `hailo optimize` on this HAR, the optimization/quantization fails in the statistics collection stage with a Keras error, both with random calibration and with a custom calibration set.
In both cases, I get the same error:
```
The shape of the target variable and the shape of the target value in variable.assign(value) must match. variable.shape=(1,), Received: value.shape=(6762,). Target variable: <KerasVariable shape=(1,), dtype=float32, path=mean_square_value_by_feature/accumulated_statistic>

Arguments received by ActivationOp.call():
  • inputs=tf.Tensor(shape=(8, 1, 6762, 6762), dtype=float32)
  • fully_native=None
  • encoding_tensors=None
  • skip_stats=False
  • training=False
  • kwargs={'cache_config': 'None'}
```
So it seems that `mean_square_value_by_feature` (or a similar stats routine) assumes a 1‑element variable, but receives a vector of length 6762, for a layer whose input is `(8, 1, 6762, 6762)`.
## Questions

1. Is this **statistics shape error** (target `(1,)` vs. value `(6762,)` in `mean_square_value_by_feature/accumulated_statistic`, with input `(8, 1, 6762, 6762)`) a known issue in the SDK versions I'm using (DFC 3.33.0 / HailoRT 4.23.0)?
2. Is there any **workaround or internal configuration** to:
   - change the statistics mode (e.g. per-tensor instead of per-feature),
   - disable this specific statistics type for problematic layers, or
   - otherwise run a simplified quantization flow that avoids this code path?
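For clarity on what I mean by the per-tensor vs. per-feature distinction, here is a small numpy illustration (my own sketch, not Hailo SDK internals): a per-tensor statistic reduces to a scalar, while a per-feature one keeps one value per feature axis, which is where a 6762-long accumulator could come from for an activation of shape `(8, 1, 6762, 6762)`.

```python
import numpy as np

# Small stand-in activation of shape (batch, C, H, W).
x = np.random.randn(2, 3, 4, 5).astype(np.float32)

# Per-tensor mean-square statistic: a single scalar accumulator.
per_tensor = np.mean(x ** 2)

# Per-feature mean-square statistic: one value per channel,
# so the accumulator has the length of the feature axis.
per_feature = np.mean(x ** 2, axis=(0, 2, 3))

print(per_tensor.shape)   # ()
print(per_feature.shape)  # (3,)
```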
3. Is there a **recommended model script** (`--model-script`, `.alls` file) configuration for models with large intermediate shapes like this that would make the quantization statistics more robust?
4. If this is a bug that has already been fixed, could you point me to:
   - the SDK version, or
   - a patch / internal flag

   that resolves it?