How do I modify the lprnet.yaml file when compiling with hailomz?

I wanted to train LPRNet on my own dataset based on this document.

Setting up the Docker environment went without problems, so inside it I used the pre-trained weights to retrain on the CCPD2020 dataset (11,776 images, 922 MB total) and obtained Final_LPRNet_model.pth:

python train_LPRNet.py --train_img_dirs /data/images/train --test_img_dirs /data/images/test --max_epoch 15 --train_batch_size 64 --test_batch_size 32 --resume_epoch 15 --pretrained_model pre_trained/lprnet.pth --save_folder runs/exp0/ --test_interval 2000

I then used export.py to convert the .pth weights to ONNX:

python export.py --onnx lprnet.onnx --weights Final_LPRNet_model.pth
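
As a quick sanity check before compiling, the exported ONNX can be loaded and run with onnxruntime. A minimal sketch, assuming the export produced static input dimensions:

import numpy as np
import onnxruntime as ort

# load the exported model and inspect its declared input
sess = ort.InferenceSession("lprnet.onnx")
inp = sess.get_inputs()[0]
print("input:", inp.name, inp.shape)  # e.g. [1, 3, 75, 300] for an NCHW export

# run one dummy forward pass to confirm the graph executes end to end
dummy = np.random.rand(*inp.shape).astype(np.float32)
outputs = sess.run(None, {inp.name: dummy})
print("output shape:", outputs[0].shape)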

I also created calibration images under CCPD2020/calib; the image resolution is 75x300.
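
For reference, a minimal sketch of how such a calibration set could be prepared with Pillow; the source directory here is hypothetical:

from pathlib import Path
from PIL import Image

src = Path("CCPD2020/test")    # hypothetical folder of plate crops
dst = Path("CCPD2020/calib")
dst.mkdir(parents=True, exist_ok=True)

# resize each crop to 300x75 (width x height), i.e. a 75x300 input image
for p in sorted(src.glob("*.jpg"))[:1024]:  # a subset is enough for calibration
    Image.open(p).convert("RGB").resize((300, 75)).save(dst / p.name)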

I downloaded lprnet.yaml and modified network_path to point at my lprnet.onnx.
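
For illustration, the relevant part of the yaml might look like the sketch below. The exact schema differs between Model Zoo versions, so treat the field names and layout as an assumption rather than a definitive template:

network:
  network_name: lprnet
paths:
  alls_script: lprnet.alls
  network_path:
  - lprnet.onnx  # changed from the stock model-zoo path to the retrained ONNX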

Finally, I used hailomz to compile the ONNX into HEF format:

hailomz compile --ckpt lprnet.onnx --calib-path CCPD2020/calib --yaml lprnet.yaml

This produced the following error:

[info] Fine Tune is done (completion time is 00:08:03.12)
[info] Starting Layer Noise Analysis
Full Quant Analysis:   0%|                                                                              | 0/2 [00:00<?, ?iterations/s]Traceback (most recent call last):
  File "/home/ubuntu/hailo/bin/hailomz", line 8, in <module>
    sys.exit(main())
...
    TypeError: Exception encountered when calling layer 'lat_model' (type LATModel).
    
    in user code:
    
        File "/home/ubuntu/hailo/lib/python3.10/site-packages/hailo_model_optimization/algorithms/lat_utils/lat_model.py", line 340, in call  *
            n_ancestors = self._native_model.flow.ancestors(lname)
        File "/home/ubuntu/hailo/lib/python3.10/site-packages/hailo_model_optimization/acceleras/model/hailo_model/model_flow.py", line 31, in ancestors  *
            return nx.ancestors(self, source)
    
        TypeError: outer_factory.<locals>.inner_factory.<locals>.tf__func() missing 1 required keyword-only argument: '__wrapper'
    
    
    Call arguments received by layer 'lat_model' (type LATModel):
      • inputs=tf.Tensor(shape=(8, 75, 300, 3), dtype=float32)

My ONNX parameters are as follows:

I suspect the parameters in lprnet.yaml do not match the ONNX model. How can I verify this?

Hey @ttypie ,

Welcome to the Hailo Community!

Troubleshooting Guide: LATModel Layer Error

Error Description

TypeError: outer_factory.<locals>.inner_factory.<locals>.tf__func() 
missing 1 required keyword-only argument: '__wrapper'

The error occurs during the Layer Noise Analysis step of model optimization, when the LATModel layer fails to receive a required argument. This typically indicates a compatibility issue between the layer implementation and Hailo’s compilation pipeline.

Resolution Approaches

1. Layer Compatibility Check

  • Review Hailo’s supported layer documentation
  • Note: LATModel is not explicitly listed as a supported layer type
  • Verify that you are using the latest Hailo SDK version, which may include updated support (a quick version check is sketched below)
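
One way to check the installed SDK components, assuming a pip-based install like the virtualenv shown in your traceback:

# list installed Hailo packages and their versions
pip list | grep -i hailo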

2. Model Architecture Modifications

Option A: Layer Replacement

  • Replace LATModel with supported alternative layers:
    • Convolutional layers
    • Pooling layers
    • ReLU activation
    • Other standard neural network components

Option B: Layer Removal

  • Evaluate if LATModel is essential for model performance
  • Consider removing the layer if it’s used for non-critical optimizations
  • Test model accuracy after removal to ensure performance is maintained

3. Advanced Debugging

Use enhanced logging for deeper analysis:

# Enable verbose output
hailomz compile --verbose

# Additional debug flags if available
hailomz compile --debug --log-level=DEBUG
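
Regarding your original question of how to tell whether lprnet.yaml matches the ONNX model: you can print the graph inputs and outputs with the onnx Python package and compare them against the shape the compiler reports (the (8, 75, 300, 3) tensor in your traceback; note that shape is NHWC, while ONNX exports are typically NCHW). A minimal sketch:

import onnx

model = onnx.load("lprnet.onnx")

# print every declared graph input and output with its shape
for tensor in list(model.graph.input) + list(model.graph.output):
    dims = [d.dim_value or d.dim_param for d in tensor.type.tensor_type.shape.dim]
    print(tensor.name, dims)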