For my custom semantic segmentation model, I am following the conversion pipeline from PyTorch to ONNX, then to HAR, optimized HAR, and finally HEF.
I checked that my torch model and the exported ONNX model produce the same outputs.
My torch model takes a normalized image as input from the dataloader (mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]) and produces the segmentation output.
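For reference, the normalization in my dataloader is the standard torchvision ImageNet transform; roughly like this sketch (the exact pipeline in my code may differ):

```python
import torchvision.transforms as T

# ToTensor scales pixels to [0, 1], then Normalize applies
# (x - mean) / std per channel.
preprocess = T.Compose([
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406],
                std=[0.229, 0.224, 0.225]),
])
```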
In this case, when I optimize the HAR with calibration data using the command:
hailo optimize --hw-arch hailo8 --calib-set-path …/data_npy/calib_512_1500.npy PIDNet.har
should I use a normalized numpy array (created with the same normalization used in PyTorch) for the calibration .npy file, rather than the raw image values in the [0, 255] range?
Note that my custom model doesn’t have an input normalization layer.
I proceeded this way, but the segmentation results from my custom HEF are very poor.
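For context, this is roughly how I build the calibration array (a sketch; the calib_images directory is a placeholder, the 512 resolution and 1500 count come from my file name, and the NHWC layout is my assumption about what the optimizer expects):

```python
import numpy as np
from PIL import Image
from pathlib import Path

# Stack calibration images into a single (N, H, W, C) float32 array.
# Here the raw [0, 255] pixel values are kept; whether to normalize
# them first is exactly my question above.
paths = sorted(Path("calib_images").glob("*.png"))[:1500]
calib = np.stack([
    np.asarray(Image.open(p).convert("RGB").resize((512, 512)),
               dtype=np.float32)
    for p in paths
])
np.save("calib_512_1500.npy", calib)
```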
The best approach is to add the normalization to the model so that it is performed on the Hailo device. This can be done by adding a normalization command to the model script (.alls file). Please note that the mean and std values in the command must be given on the [0, 255] scale (i.e., your PyTorch values multiplied by 255).
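For example, a minimal model script for the ImageNet statistics above could look like this (the name normalization1 on the left is just a label for the added layer):

```
normalization1 = normalization([123.675, 116.28, 103.53], [58.395, 57.12, 57.375])
```

You would then pass the script during optimization, together with a calibration set of raw [0, 255] images, along these lines (assuming your DFC version's optimize command accepts a --model-script flag):

```
hailo optimize --hw-arch hailo8 --model-script PIDNet.alls --calib-set-path …/data_npy/calib_512_1500.npy PIDNet.har
```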
I have created a calib_dataset.npy, but now I don't know how to create the model script (model.alls) or what to write in it. Could you show one step-by-step example, so that I can write one for any model in ONNX format?
Thanks
So how do I create this for any open-source model available out there? And is there any other file needed for this step to create the quantized HAR file?