I wanted to train LPRNet on my own dataset based on this document.
Setting up the Docker environment went smoothly, so inside Docker I used the pre-trained weights to retrain on the CCPD2020 dataset (11,776 images, 922 MB total) and obtained Final_LPRNet_model.pth:
python train_LPRNet.py --train_img_dirs /data/images/train --test_img_dirs /data/images/test --max_epoch 15 --train_batch_size 64 --test_batch_size 32 --resume_epoch 15 --pretrained_model pre_trained/lprnet.pth --save_folder runs/exp0/ --test_interval 2000
I then used export.py to convert the .pth file to ONNX:
python export.py --onnx lprnet.onnx --weights Final_LPRNet_model.pth
I also created the calibration images under CCPD2020/calib; each image has a resolution of 75x300.
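As a sanity check on the calibration set (my own sketch; the folder name and .jpg extension are assumptions), something like this flags any image whose size differs from the expected 300x75 in PIL's (width, height) convention, i.e. 75x300 as height x width:

```python
from pathlib import Path
from PIL import Image  # pip install pillow

def find_wrong_sizes(folder, expected=(300, 75)):
    """Return (filename, size) for every image not matching expected (W, H)."""
    bad = []
    for path in sorted(Path(folder).glob("*.jpg")):
        with Image.open(path) as im:
            if im.size != expected:
                bad.append((path.name, im.size))
    return bad

# Example: find_wrong_sizes("CCPD2020/calib") returns [] if all images match.
```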
I downloaded lprnet.yaml and modified network_path to point to my lprnet.onnx.
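For reference, the change is roughly the following (a sketch of the Model Zoo yaml layout from memory; the exact keys in the downloaded lprnet.yaml take precedence):

```yaml
paths:
  network_path:
  - lprnet.onnx   # replaced the stock model-zoo path with my exported model
```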
Finally, I used hailomz to compile the ONNX model to HEF format:
hailomz compile --ckpt lprnet.onnx --calib-path CCPD2020/calib --yaml lprnet.yaml
The compilation fails with the following error:
[info] Fine Tune is done (completion time is 00:08:03.12)
[info] Starting Layer Noise Analysis
Full Quant Analysis: 0%| | 0/2 [00:00<?, ?iterations/s]Traceback (most recent call last):
File "/home/ubuntu/hailo/bin/hailomz", line 8, in <module>
sys.exit(main())
...
TypeError: Exception encountered when calling layer 'lat_model' (type LATModel).
in user code:
File "/home/ubuntu/hailo/lib/python3.10/site-packages/hailo_model_optimization/algorithms/lat_utils/lat_model.py", line 340, in call *
n_ancestors = self._native_model.flow.ancestors(lname)
File "/home/ubuntu/hailo/lib/python3.10/site-packages/hailo_model_optimization/acceleras/model/hailo_model/model_flow.py", line 31, in ancestors *
return nx.ancestors(self, source)
TypeError: outer_factory.<locals>.inner_factory.<locals>.tf__func() missing 1 required keyword-only argument: '__wrapper'
Call arguments received by layer 'lat_model' (type LATModel):
• inputs=tf.Tensor(shape=(8, 75, 300, 3), dtype=float32)
My ONNX parameters are as follows:
I suspect the parameters in lprnet.yaml do not match the ONNX model. How can I verify whether that is the cause of this error?