Yolov8-OBB usage on Hailo8

Hello Hailo Community,

I have trained a custom YOLOv8s-OBB model and converted it to ONNX and then to .hef, using these end nodes of the YOLO model:
name: yolov8s/conv43 → tensor: ?[-1,240,240,64]

name: yolov8s/conv44 → tensor: ?[-1,240,240,3]

name: yolov8s/conv45 → tensor: ?[-1,240,240,1]

name: yolov8s/conv57 → tensor: ?[-1,120,120,64]

name: yolov8s/conv58 → tensor: ?[-1,120,120,3]

name: yolov8s/conv59 → tensor: ?[-1,120,120,1]

name: yolov8s/conv70 → tensor: ?[-1,60,60,64]

name: yolov8s/conv71 → tensor: ?[-1,60,60,3]

name: yolov8s/conv72 → tensor: ?[-1,60,60,1]

I used custom dequantization for those 9 end nodes to extract the bounding boxes, confidence, class, … but I did not get any results that show correct detections.
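For reference, the nine end nodes are three per scale, and the dequantization itself boils down to the affine mapping float = scale * (q − zero_point). Below is a minimal sketch of how the nodes can be grouped before decoding; the 64/3/1 = box-DFL/class/angle interpretation and the stride keys are assumptions based on the standard YOLOv8-OBB head layout, not something confirmed by the HEF:

```python
import numpy as np

# Assumed grouping of the end nodes listed above: per scale, the 64-channel
# node is the DFL box head, the 3-channel node the class head, and the
# 1-channel node the angle head. The stride keys assume the usual YOLOv8
# strides of 8/16/32 -- check against your input resolution.
SCALES = {
    8:  ("yolov8s/conv43", "yolov8s/conv44", "yolov8s/conv45"),  # 240x240 grid
    16: ("yolov8s/conv57", "yolov8s/conv58", "yolov8s/conv59"),  # 120x120 grid
    32: ("yolov8s/conv70", "yolov8s/conv71", "yolov8s/conv72"),  # 60x60 grid
}

def dequantize(q: np.ndarray, scale: float, zero_point: float) -> np.ndarray:
    """Affine dequantization: float = scale * (quantized - zero_point)."""
    return scale * (q.astype(np.float32) - zero_point)
```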

Is there any guide on how to use OBB models on Hailo8 chips? As far as I can tell, there is currently no way to post-process the rotated bounding boxes on the Hailo device.

Thank you in advance and best regards.

Hi @Simon_Wolf
See OBB example code here: Hailo-Application-Code-Examples/runtime/python/oriented_object_detection/README.md at main · hailo-ai/Hailo-Application-Code-Examples · GitHub

Hi @giladn,

I have reviewed the sample code, but it does not really help me because it is for YOLOv11-OBB. I tried to use the dequantization from the sample, but I did not get any usable bounding box outputs from the model.

I am using the DeGirum library for my inference, so I call the model like this:

model = dg.load_model(model_name=model_name, inference_host_address=inference_host_address, zoo_url=zoo_url, token=token, device_type=device_type)
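For context, a minimal sketch of how such a model is loaded and run with the DeGirum PySDK; all string values are placeholders, and I am assuming the PySDK convention that calling the loaded model returns a result whose `results` list holds one dict per output tensor, as in the dump below:

```python
import degirum as dg

# Placeholders -- use your own zoo location, model name and token.
model_name = "yolov8s_obb_custom"
zoo_url = "<your model zoo>"
token = "<your token>"
inference_host_address = "@local"
device_type = "HAILORT/HAILO8"

model = dg.load_model(
    model_name=model_name,
    inference_host_address=inference_host_address,
    zoo_url=zoo_url,
    token=token,
    device_type=device_type,
)

# Run inference on a single frame; "frame.jpg" is a placeholder path.
result = model("frame.jpg")

# With no post-processor configured, each raw output arrives as a dict with
# 'data', 'name', 'shape' and 'quantization' fields, as shown below.
for out in result.results:
    print(out["name"], out["shape"], out["quantization"])
```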

If it helps, the output I get from the model looks like this:

{'data': array([[[140, 139, 115, ..., 102, 111, 108],
        [120, 124, 116, ..., 105, 110, 106],
        [123, 124, 111, ..., 104, 110, 101],
        ...,
        [109, 113, 112, ..., 109, 110, 108],
        [111, 114, 112, ..., 109, 111, 108],
        [118, 122, 112, ..., 110, 118, 109]]], dtype=uint8), 'id': 2, 'name': 'yolov8s/conv57', 'quantization': {'axis': -1, 'scale': [...], 'zero': [...]}, 'shape': [1, 14400, 64], 'size': 921600, 'type': 'DG_UINT8'}

And the quantization info of one output looks like this:

yolov8s/conv44 quantization: {'axis': -1, 'scale': [0.41985028982162476], 'zero': [201]}
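For completeness, a sketch of applying those quantization parameters to a raw output dict (`dequantize_output` is just an illustrative helper name, not a DeGirum API):

```python
import numpy as np

def dequantize_output(out: dict) -> np.ndarray:
    """Undo the affine quantization described by the dict's 'quantization' field."""
    scale = np.asarray(out["quantization"]["scale"], dtype=np.float32)
    zero = np.asarray(out["quantization"]["zero"], dtype=np.float32)
    # Broadcasting over the last axis covers both per-tensor and per-channel parameters.
    return (out["data"].astype(np.float32) - zero) * scale

# For the conv44 output above this is simply (data - 201) * 0.41985...
```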

My problem is that I am not able to dequantize the values in a way that makes the Hailo inference match the results I got from my original YOLO inference. I don't know if that's a topic for this forum, but I'll try to find a solution and post it here.
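For anyone attempting the same, the decoding that has to follow the dequantization looks roughly like the sketch below. It assumes the standard Ultralytics YOLOv8-OBB head: a softmax expectation over the 16 DFL bins per box side, sigmoid on the class logits, and the angle mapped via (sigmoid(a) − 0.25) · π. It expects NHWC inputs that are already dequantized, the channel ordering within the 64-channel head is an assumption, and rotated NMS still has to be applied afterwards:

```python
import numpy as np

REG_MAX = 16  # DFL bins per box side (Ultralytics default)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_scale(box_dfl, cls_logits, angle_raw, stride):
    """Decode one scale of a YOLOv8-OBB head (all inputs already dequantized).

    box_dfl:    (H, W, 64)  DFL logits, 4 sides * 16 bins
    cls_logits: (H, W, nc)  class logits
    angle_raw:  (H, W, 1)   raw angle output
    Returns (N, 5) rotated boxes [cx, cy, w, h, theta] in input-image pixels
    and (N, nc) class scores.
    """
    h, w, _ = box_dfl.shape

    # DFL: softmax over the 16 bins, expectation gives the distance per side.
    dist = softmax(box_dfl.reshape(h, w, 4, REG_MAX), axis=-1) @ np.arange(REG_MAX, dtype=np.float32)

    # Anchor (cell-center) grid in grid units.
    xv, yv = np.meshgrid(np.arange(w) + 0.5, np.arange(h) + 0.5)

    # Angle decoding as in Ultralytics: sigmoid, then map to [-pi/4, 3pi/4).
    theta = (sigmoid(angle_raw[..., 0]) - 0.25) * np.pi

    # dist2rbox: rotate the (left, top, right, bottom) offsets by theta.
    lt, rb = dist[..., :2], dist[..., 2:]
    xf, yf = ((rb - lt) / 2)[..., 0], ((rb - lt) / 2)[..., 1]
    cos_t, sin_t = np.cos(theta), np.sin(theta)
    cx = xv + xf * cos_t - yf * sin_t
    cy = yv + xf * sin_t + yf * cos_t
    wh = lt + rb

    # Scale centers and sizes by the stride; the angle stays in radians.
    boxes = np.stack([cx, cy, wh[..., 0], wh[..., 1], theta], axis=-1) * \
            np.array([stride, stride, stride, stride, 1.0], dtype=np.float32)
    scores = sigmoid(cls_logits)
    return boxes.reshape(-1, 5), scores.reshape(-1, scores.shape[-1])
```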

Hi @Simon_Wolf,

Since you are using DeGirum, maybe @shashi can help you here.

Hi @nina-vilela,

I made my best effort to get the dequantization and post-processing working with my OBB model. That worked better than expected, but not well enough for my application.

What I did instead was implement the application using a standard object detection model with a few more classes and finer post-processing to derive the angles I need.
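In case it helps someone with a similar workaround: one way to realize the "more classes plus finer post-processing" idea is to detect two reference parts of the object as separate classes and compute the orientation from their box centers. This is a hypothetical illustration of the approach, not necessarily what fits every application:

```python
import math

def box_center(box):
    """Axis-aligned box (x1, y1, x2, y2) -> center point."""
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def angle_from_parts(tip_box, base_box):
    """Orientation (radians) of the vector from the 'base' detection to the 'tip' detection."""
    tx, ty = box_center(tip_box)
    bx, by = box_center(base_box)
    return math.atan2(ty - by, tx - bx)

# Placeholder boxes for illustration only.
print(angle_from_parts((100, 100, 120, 120), (40, 100, 60, 120)))  # ~0.0 rad
```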

Now I can run that model with my previous code on my Hailo8L and the inference results are perfect for my application.

Thank you very much for your support.
