Model not working after conversion to HEF

Hello,
I’m having issues after converting YOLOv8s to HEF.
When I test the .pt file using Ultralytics predict, everything works perfectly. After conversion I want to use the file in Frigate, but detections only rarely show up.

At the beginning it was working fine with a smaller dataset. It was not very accurate due to the small dataset, but it seemed to be 1:1 compared to Ultralytics predict. I did not change anything in the config; the alls and NMS JSON configs are unchanged from the defaults.
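
The baseline test I mean is just Ultralytics predict on the unconverted .pt weights, roughly like this (the weights path and test source are placeholders):

from ultralytics import YOLO

# Baseline check: run the unconverted .pt weights through Ultralytics predict.
# The weights path and test source below are placeholders.
model = YOLO("runs/detect/train/weights/best.pt")
results = model.predict("test_images/", imgsz=640, conf=0.25)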

imgsz 640
model yolov8s
dataset size 2028

model.train(data = "dataset_custom.yaml", imgsz = 640, batch = 8, epochs = 300, workers = 0, device = 0)

epochs 300 (the last run reported “Best results observed at epoch 63”)

hailomz compile yolov8s --ckpt=best.onnx --hw-arch hailo8l --calib-path train/images --classes 1 --performance

I’m kind of at a loss at this point. Any idea what could be wrong?

I have exported to ONNX with the opset=11 parameter and adjusted the alls as follows, based on recommendations in this forum (a sketch of the export call is below, after the alls):

normalization1 = normalization([0.0, 0.0, 0.0], [255.0, 255.0, 255.0])
model_optimization_flavor(optimization_level=0, compression_level=0)
change_output_activation(conv42, sigmoid)
change_output_activation(conv53, sigmoid)
change_output_activation(conv63, sigmoid)
performance_param(compiler_optimization_level=max)
quantization_param([conv42, conv53, conv63], force_range_out=[0.0, 1.0])
model_optimization_config(calibration, batch_size=16, calibset_size=2028)
nms_postprocess("../../postprocess_config/yolov8s_nms_config.json", meta_arch=yolov8, engine=cpu)
performance_param(compiler_optimization_level=max)
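
The export step was the standard Ultralytics ONNX export with opset 11, roughly like this (the weights path is a placeholder):

from ultralytics import YOLO

# Export the trained weights to ONNX (opset 11) for the Hailo toolchain.
# The weights path is a placeholder.
model = YOLO("runs/detect/train/weights/best.pt")
model.export(format="onnx", opset=11, imgsz=640)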

Now the model works in general, but it’s nowhere near as good as pre-conversion, and there are false positives.

I plan to try yolov8m to see if I get any better results.

Here is an example of how to use Frigate on a Raspberry Pi 5 with a Hailo-8.

I have tried using the yolov8n object detection model on Frigate, and here is an example of how to train yolov8n. Please use the Hailo 4.20 DFC :slight_smile:

I have seen those. They may work for the COCO example model, but not really for my custom model.
With the adjusted alls I posted in my second post, it works, but I’m not getting the accuracy I need and there are some false positives.

How do you use 4.20 with the RPi image? The Frigate image is built with 4.19, and from what I have seen here, it is not recommended to mix versions.

I have been testing with yolov8m too, but I think “m” should not be necessary when “s” works great before conversion to HEF.

So far I have figured out that raising the optimization level above 1 completely breaks detection.
Right now I’m changing one parameter at a time to figure out the best alls parameters for my case. With no experience, this being my first attempt at creating a custom object detection model, it is taking a really long time.

I’m also thinking about increasing my dataset size to get the ONNX detection accuracy as close to perfect as possible.
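
Before that, I want to sanity-check the export itself by running the same image through both the .pt and the exported .onnx with Ultralytics predict (it can load ONNX weights directly), so any remaining gap can be pinned on quantization rather than the export. Something like this, with placeholder filenames:

from ultralytics import YOLO

# Compare the original .pt against the exported .onnx on the same image, so any
# remaining accuracy gap can be attributed to quantization rather than the export.
# Filenames are placeholders.
pt_model = YOLO("best.pt")
onnx_model = YOLO("best.onnx")
print(pt_model.predict("sample.jpg", imgsz=640, conf=0.25)[0].boxes)
print(onnx_model.predict("sample.jpg", imgsz=640, conf=0.25)[0].boxes)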

Maybe you should train your own model and convert it, then input your label.

What do you mean by inputting the label after training and converting?

Different datasets will train different models and produce different labels.

I’m using my own custom dataset that I labeled myself, without any pretraining data. It only has one label.

This is driving me crazy. Once it went well, but the moment I added more labeled images it went haywire with absurd false positives again, using exactly the same settings.

Settings I used:

normalization1 = normalization([0.0, 0.0, 0.0], [255.0, 255.0, 255.0])
change_output_activation(conv58, sigmoid)
change_output_activation(conv71, sigmoid)
change_output_activation(conv83, sigmoid)
#model_optimization_config(calibration, batch_size=8)
model_optimization_flavor(optimization_level=1, compression_level=0, batch_size=8)
quantization_param([conv58, conv71, conv83], force_range_out=[0.0, 1.0])
post_quantization_optimization(finetune, policy=enabled, learning_rate=0.000025)
nms_postprocess("../../postprocess_config/yolov8m_nms_config.json", meta_arch=yolov8, engine=cpu)
performance_param(compiler_optimization_level=max)
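
One thing I keep double-checking with the normalization line: as I understand it, normalization([0.0, 0.0, 0.0], [255.0, 255.0, 255.0]) makes the compiled model do the (x - mean) / std scaling on-chip, so calibration images and runtime frames should be fed as raw 0-255 values rather than already scaled to 0-1. The equivalent math, as a small sketch:

import numpy as np

# What the on-chip normalization layer computes for mean=[0,0,0], std=[255,255,255]
# (my understanding of the alls normalization command): raw 0-255 RGB in, 0-1 out.
frame = np.random.randint(0, 256, (640, 640, 3), dtype=np.uint8)   # stand-in for a real frame
scaled = (frame.astype(np.float32) - 0.0) / 255.0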

Since I managed to get the model working and the issue is now about false positives, I will be opening a new topic.