I am trying to fine-tune a CNN for RGB images. To do so, I am using `train_encoding` to optimize the weights and biases of the layers. However, it fails with the error `you cannot mix tensors and non-tensors`. The reported list of mixed tensors and non-tensors only includes `avgpool` layers and their operations. Yet, when I pass those layers to `layers_to_freeze`, it still fails.
Use pre‑quantization optimizations for problematic avgpools
For avgpool layers that cause quantization/encoding issues (especially global or near‑global pools), the recommended approach is to transform them before quantization with a pre‑quantization optimization command in the model script.
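As a rough sketch, such a transformation might look like the fragment below. The command name and arguments are assumptions based on Hailo's model‑script conventions, not a verified snippet; check the model script reference for your SDK version before using it.

```
# Model script (.alls) fragment -- illustrative only; verify the exact
# command name, layer names, and arguments against your Hailo Dataflow
# Compiler documentation.
pre_quantization_optimization(global_avgpool_reduction, layers=[avgpool1])
```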
If your failing layers are global or large‑kernel avgpools, applying such a pre‑quantization optimization before `train_encoding` may avoid the internal inconsistency that leads to the tensor/non‑tensor mix error.
Confirm `layers_to_freeze` usage
The docs define `layers_to_freeze` as:
Freeze (don’t modify weights & biases for) any layer whose name includes one of this list as a substring.
You must pass substrings that actually appear in the Hailo‑HN layer names (e.g. `avgpool1`, `avgpool`), not the framework‑side names.
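To make the substring semantics concrete, here is a small, self‑contained sketch (the layer names are made up, and this only mimics the matching rule quoted above rather than calling the actual SDK):

```python
# Illustrates the substring-matching rule for layers_to_freeze:
# a layer is frozen if its (Hailo-side) name contains any listed substring.
layer_names = ["conv1", "avgpool1", "fc1", "avgpool2"]  # hypothetical HN names
layers_to_freeze = ["avgpool"]  # one substring matches every avgpool layer

frozen = [
    name for name in layer_names
    if any(sub in name for sub in layers_to_freeze)
]
print(frozen)  # ['avgpool1', 'avgpool2']
```

Note that passing a framework‑side name such as `model.avgpool` would match nothing here, which is why the Hailo‑side names matter.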
Freezing a layer only prevents its weights/biases from being updated; it doesn’t change how its tensors are represented in the graph. So if the error is about mixing tensor/non‑tensor objects in the computation graph, freezing alone may not resolve it.