Hi,
First off, there is a slight assumption here that I know what I am doing, which is a bit off.
I have no real knowledge of how these tools and this software work; I can fiddle them together and run the commands you ask of me, but that's about it.
Did you notice that I found another model that was not built for the Hailo-8L, the yolov8m?
#> ./repos/hailo/hailort/build/hailort/hailortcli/hailortcli parse-hef yolov8m.hef
Architecture HEF was compiled for: HAILO8
Network group name: yolov8m, Multi Context - Number of contexts: 3
Network name: yolov8m/yolov8m
VStream infos:
Input yolov8m/input_layer1 UINT8, NHWC(640x640x3)
Output yolov8m/yolov8_nms_postprocess FLOAT32, HAILO NMS BY CLASS(number of classes: 80, maximum bounding boxes per class: 100, maximum frame size: 160320)
Operation:
Op YOLOV8
Name: YOLOV8-Post-Process
Score threshold: 0.200
IoU threshold: 0.70
Classes: 80
Max bboxes per class: 100
Image height: 640
Image width: 640
I also tried with this one:
https://hailo-model-zoo.s3.eu-west-2.amazonaws.com/ModelZoo/Compiled/v2.17.0/hailo8/yolov8s.hef
#> ./repos/hailo/hailort/build/hailort/hailortcli/hailortcli run yolov8s.hef
Running streaming inference (yolov8s.hef):
Transform data: true
Type: auto
Quantized: true
Network yolov8s/yolov8s: 100% | 2454 | FPS: 490.19 | ETA: 00:00:00
Inference result:
Network group: yolov8s
Frames count: 2454
FPS: 490.21
Send Rate: 4818.94 Mbit/s
Recv Rate: 4788.82 Mbit/s
Which parses as:
#> ./repos/hailo/hailort/build/hailort/hailortcli/hailortcli parse-hef yolov8s.hef
Architecture HEF was compiled for: HAILO8
Network group name: yolov8s, Single Context
Network name: yolov8s/yolov8s
VStream infos:
Input yolov8s/input_layer1 UINT8, NHWC(640x640x3)
Output yolov8s/yolov8_nms_postprocess FLOAT32, HAILO NMS BY CLASS(number of classes: 80, maximum bounding boxes per class: 100, maximum frame size: 160320)
Operation:
Op YOLOV8
Name: YOLOV8-Post-Process
Score threshold: 0.200
IoU threshold: 0.70
Classes: 80
Max bboxes per class: 100
Image height: 640
Image width: 640
When it comes to the model I actually run in Frigate, I don't know how to extract it, since there are no .hef files anywhere when I enter the Frigate container.
I am using Frigate+ and just supply a model id a la "plus://", so I don't know how the model is acquired.
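In case it helps: this is roughly how I would go hunting for the downloaded model from the host. The container name "frigate" and the guess that it might sit under some kind of model cache directory are assumptions on my part, not confirmed:
#> docker exec frigate sh -c "find / -name '*.hef' -o -path '*model_cache*' 2>/dev/null"
If that prints nothing, then the Frigate+ model presumably never lands on disk as a plain .hef at all.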
Also note that Frigate container builds match their Hailo drivers to what is supported in HAOS, so I am currently stuck at v4.21.0.
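For anyone who wants to double-check which driver and firmware versions are actually loaded, something like this should work (hailo_pci is just my assumption for the kernel module name; adjust if yours differs):
#> ./repos/hailo/hailort/build/hailort/hailortcli/hailortcli fw-control identify
#> modinfo hailo_pci | grep -i version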
I did just notice, though, that I was using the 640x640 model, and when I changed to the 320x320 one, inference times dropped to ~14 ms. Whether I should be happy with that, I have no clue.
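If anyone wants to sanity-check that number, I believe hailortcli run can report hardware latency with --measure-latency (at least in recent HailoRT versions); the file names below are just placeholders for the 640 and 320 variants:
#> ./repos/hailo/hailort/build/hailort/hailortcli/hailortcli run yolov8_640.hef --measure-latency
#> ./repos/hailo/hailort/build/hailort/hailortcli/hailortcli run yolov8_320.hef --measure-latency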
What I feel is MOST IMPORTANT, though, is that no one seems to care about the build fix in #PR22, which is such a simple fix that I can't understand why it isn't merged.
Without it, I can't build from more up-to-date code from the real repo.
Br,
Taisto