Hey @michael.nilsson ,
I understand that the bounding-box coordinates exceed the [0, 1] range with the 1088x1088 configuration, while the same pipeline works fine at 640x640. Let me help you resolve this.
## Understanding the Issue
The problem typically occurs when there’s a mismatch between:
- Training preprocessing and inference preprocessing
- How the model was compiled for the target resolution
- How padding and scaling are handled during inference
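A quick way to confirm the symptom is to scan the decoded, normalized detections for coordinates outside [0, 1]. `find_out_of_range` below is a hypothetical helper for illustration, not part of any Hailo API:

```python
import numpy as np

def find_out_of_range(boxes, eps=1e-3):
    """Return indices of normalized boxes with any coordinate outside [0, 1]."""
    bad = np.any((boxes < -eps) | (boxes > 1 + eps), axis=1)
    return np.flatnonzero(bad)

# Example: the second box exceeds the valid range
dets = np.array([[0.10, 0.20, 0.50, 0.60],
                 [0.05, -0.02, 1.30, 0.90]])
print(find_out_of_range(dets))  # → [1]
```

If only a handful of boxes are slightly out of range, it is usually a padding/scaling mismatch; if most boxes are far out of range, the model was likely compiled for the wrong input resolution.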
## Solution Steps
1. Check Your Training Preprocessing
If you’re using YOLOv8, verify that your preprocessing follows the standard letterbox pattern (sketched here with OpenCV as a self-contained equivalent of Ultralytics’ letterbox, since the helper’s location varies between versions):

```python
import cv2
import numpy as np

def preprocess(image, target_size=1088):
    # Resize while maintaining aspect ratio, then pad to a square (letterbox)
    h, w = image.shape[:2]
    ratio = min(target_size / w, target_size / h)
    new_w, new_h = round(w * ratio), round(h * ratio)
    dw, dh = (target_size - new_w) // 2, (target_size - new_h) // 2
    image = cv2.resize(image, (new_w, new_h))
    image = cv2.copyMakeBorder(image, dh, target_size - new_h - dh, dw,
                               target_size - new_w - dw,
                               cv2.BORDER_CONSTANT, value=(114, 114, 114))
    image = image.astype(np.float32) / 255.0  # normalize to [0, 1]
    return image, ratio, dw, dh
```
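As a sanity check, the letterbox geometry can be computed by hand. For a hypothetical 1920x1080 source frame targeted at 1088x1088, the expected values are:

```python
# Expected letterbox geometry for a hypothetical 1920x1080 source frame
src_w, src_h, target = 1920, 1080, 1088
ratio = min(target / src_w, target / src_h)                # 1088/1920
new_w, new_h = round(src_w * ratio), round(src_h * ratio)  # 1088, 612
dw, dh = (target - new_w) // 2, (target - new_h) // 2      # 0, 238
print(new_w, new_h, dw, dh)  # → 1088 612 0 238
```

If the ratio and padding your pipeline reports differ from the hand-computed values, the preprocessing is the source of the out-of-range boxes.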
2. Adjust Your Compilation Command
Remove the `--resize` parameter if your model was specifically trained for 1088x1088:
```bash
hailomz compile yolov8n --ckpt yolov8n_1088.onnx \
  --hw-arch hailo8l \
  --calib-path dataset_only_20_1088 \
  --classes 8
```
3. Handle Output Scaling
If you’re using padding (letterboxing), make sure to adjust the bounding boxes:
```python
import numpy as np

def scale_boxes(img_shape, boxes, ratio, dw, dh):
    """Map boxes from letterboxed coords back to the original image."""
    # Remove the letterbox padding
    boxes[:, [0, 2]] -= dw  # x-coordinates
    boxes[:, [1, 3]] -= dh  # y-coordinates
    # Undo the resize ratio
    boxes[:, [0, 2]] /= ratio
    boxes[:, [1, 3]] /= ratio
    # Clip to the original image bounds; img_shape is (height, width)
    boxes[:, [0, 2]] = boxes[:, [0, 2]].clip(0, img_shape[1])
    boxes[:, [1, 3]] = boxes[:, [1, 3]].clip(0, img_shape[0])
    return boxes
```
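Worked through numerically, assuming a hypothetical 1920x1080 source frame letterboxed to 1088x1088 (so ratio = 1088/1920, dw = 0, dh = 238):

```python
import numpy as np

# One detection in 1088x1088 letterbox coordinates (hypothetical values)
boxes = np.array([[100.0, 338.0, 300.0, 538.0]])  # x1, y1, x2, y2
ratio, dw, dh = 1088 / 1920, 0, 238
boxes[:, [0, 2]] -= dw   # remove horizontal padding
boxes[:, [1, 3]] -= dh   # remove vertical padding
boxes[:, [0, 2]] /= ratio
boxes[:, [1, 3]] /= ratio
print(boxes.round(2))  # → [[176.47 176.47 529.41 529.41]]
```

After this step every coordinate should land inside the original frame; values still outside it indicate the padding offsets or ratio don't match what was actually applied during preprocessing.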
If you’re still experiencing issues after trying these steps, please share:
- Your training preprocessing code
- The exact compilation command you’re using
- A sample of the problematic output values
Best regards,
Omria