After building a HEF model, how do I utilize it?

I followed the Dataflow Compiler tutorial and Jupyter notebooks on my main rig, and those examples worked. I used them to convert this model: https://github.com/ultralytics/yolov5/releases/download/v7.0/yolov5m.onnx into a HEF file for the Hailo-8L.

I am able to run it on the RPi 5:

matteius@matteius-desktop:~/Projects/tappas/apps/h8/gstreamer/raspberrypi/detection$ hailortcli run ~/yolov5m.hef 
Running streaming inference (/home/matteius/yolov5m.hef):
  Transform data: true
    Type:      auto
    Quantized: true
Network yolov5m/yolov5m: 100% | 81 | FPS: 16.20 | ETA: 00:00:00
> Inference result:
 Network group: yolov5m
    Frames count: 81
    FPS: 16.20
    Send Rate: 159.22 Mbit/s
    Recv Rate: 278.63 Mbit/s

But my question is: how do I actually utilize the model? I tried modifying the detection.sh example:

diff --git a/apps/h8/gstreamer/raspberrypi/detection/detection.sh b/apps/h8/gstreamer/raspberrypi/detection/detection.sh
index 1358cf9..1a3e452 100755
--- a/apps/h8/gstreamer/raspberrypi/detection/detection.sh
+++ b/apps/h8/gstreamer/raspberrypi/detection/detection.sh
@@ -74,7 +74,11 @@ function parse_args() {
                 hef_path="$RESOURCES_DIR/ssd_mobilenet_v1.hef"
                 postprocess_so="$POSTPROCESS_DIR/libmobilenet_ssd_post.so"
                 thresholds_str=""
-            elif [ $2 != "yolov5" ]; then
+            elif [ $2 == "yolov5" ]; then
+               network_name="hailo_yolo_inference"
+               hef_path="/usr/share/hailo-models/yolov5m.hef"
+               postprocess_so="/usr/local/lib/aarch64-linux-gnu/rpicam-apps-postproc/libyolo_hailortpp_post.so"
+            else
                 echo "Received invalid network: $2. See expected arguments below:"
                 print_usage
                 exit 1
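
(As a side note, the properties that the installed hailonet and hailofilter elements actually expose, including the nms-* ones the script sets, can be listed with standard GStreamer introspection, assuming the TAPPAS plugins are on the plugin path:)

gst-inspect-1.0 hailonet
gst-inspect-1.0 hailofilter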

matteius@matteius-desktop:~/Projects/tappas/apps/h8/gstreamer/raspberrypi/detection$ ./detection.sh --network yolov5
Running hailo_yolo_inference
gst-launch-1.0 filesrc location=/home/matteius/Projects/tappas/apps/h8/gstreamer/raspberrypi/detection/resources/detection.mp4 name=src_0 ! qtdemux ! h264parse ! avdec_h264 max_threads=2 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! videoscale n-threads=2 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! videoconvert n-threads=3 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! hailonet hef-path=/usr/share/hailo-models/yolov5m.hef batch-size=1 output-format-type=HAILO_FORMAT_TYPE_FLOAT32 nms-score-threshold=0.3 nms-iou-threshold=0.45 output-format-type=HAILO_FORMAT_TYPE_FLOAT32 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! hailofilter function-name=hailo_yolo_inference so-path=/usr/local/lib/aarch64-linux-gnu/rpicam-apps-postproc/libyolo_hailortpp_post.so qos=false ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! hailooverlay ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! videoconvert n-threads=3 ! fpsdisplaysink video-sink=ximagesink name=hailo_display sync=false text-overlay=false
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Redistribute latency...
Redistribute latency...
NMS score threshold is set, but there is no NMS output in this model.
CHECK_SUCCESS failed with status=6
NMS score threshold is set, but there is no NMS output in this model.
CHECK_SUCCESS failed with status=6
NMS score threshold is set, but there is no NMS output in this model.
CHECK_SUCCESS failed with status=6
NMS score threshold is set, but there is no NMS output in this model.
CHECK_SUCCESS failed with status=6
NMS score threshold is set, but there is no NMS output in this model.
CHECK_SUCCESS failed with status=6
NMS score threshold is set, but there is no NMS output in this model.
CHECK_SUCCESS failed with status=6
X Error of failed request:  BadValue (integer parameter out of range for operation)
  Major opcode of failed request:  131 (XInputExtension)
  Minor opcode of failed request:  46 ()
  Value in failed request:  0xd
  Serial number of failed request:  61
  Current serial number in output stream:  65
NMS score threshold is set, but there is no NMS output in this model.
CHECK_SUCCESS failed with status=6
NMS score threshold is set, but there is no NMS output in this model.
CHECK_SUCCESS failed with status=6
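
As far as I can tell, this means the HEF I compiled has no on-chip NMS output (the pre-built model zoo HEFs apparently get theirs from an nms_postprocess step in the model script, which my conversion did not include). The outputs a HEF actually exposes can be checked with the HailoRT CLI:

hailortcli parse-hef ~/yolov5m.hef

If no NMS output shows up there, then presumably the nms-score-threshold / nms-iou-threshold properties on hailonet have to be dropped, or the model recompiled with the NMS postprocess included.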

Then, when I take that GStreamer command and remove the NMS-related args, I get:

matteius@matteius-desktop:~/Projects/tappas/apps/h8/gstreamer/raspberrypi/detection$ ./detection.sh --network yolov5
Running hailo_yolo_inference
gst-launch-1.0 filesrc location=/home/matteius/Projects/tappas/apps/h8/gstreamer/raspberrypi/detection/resources/detection.mp4 name=src_0 ! qtdemux ! h264parse ! avdec_h264 max_threads=2 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! videoscale n-threads=2 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! videoconvert n-threads=3 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! hailonet hef-path=/usr/share/hailo-models/yolov5m.hef batch-size=1 output-format-type=HAILO_FORMAT_TYPE_FLOAT32 output-format-type=HAILO_FORMAT_TYPE_FLOAT32 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! hailofilter function-name=hailo_yolo_inference so-path=/usr/local/lib/aarch64-linux-gnu/rpicam-apps-postproc/libyolo_hailortpp_post.so qos=false ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! hailooverlay ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! videoconvert n-threads=3 ! fpsdisplaysink video-sink=ximagesink name=hailo_display sync=false text-overlay=false
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Redistribute latency...
Redistribute latency...
X Error of failed request:  BadValue (integer parameter out of range for operation)
  Major opcode of failed request:  131 (XInputExtension)
  Minor opcode of failed request:  46 ()
  Value in failed request:  0xd
  Serial number of failed request:  61
  Current serial number in output stream:  65
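
This looks more like a display-sink problem than a Hailo problem. To separate the two, the same pipeline can presumably be run with the video sink swapped out for a fakesink, roughly:

gst-launch-1.0 filesrc location=/home/matteius/Projects/tappas/apps/h8/gstreamer/raspberrypi/detection/resources/detection.mp4 ! qtdemux ! h264parse ! avdec_h264 max_threads=2 ! videoscale n-threads=2 ! videoconvert n-threads=3 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! hailonet hef-path=/usr/share/hailo-models/yolov5m.hef batch-size=1 ! queue max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! hailofilter function-name=hailo_yolo_inference so-path=/usr/local/lib/aarch64-linux-gnu/rpicam-apps-postproc/libyolo_hailortpp_post.so qos=false ! fakesink sync=false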

I've also been following this tutorial: https://github.com/hailo-ai/tappas/blob/master/docs/write_your_own_application/write-your-own-postprocess.rst

With the HEF I compiled, I can get through the first part (compiling the postprocess), but once I get to the second step I hit errors similar to the ones above:

$ gst-launch-1.0 filesrc location=$TAPPAS_WORKSPACE/apps/h8/gstreamer/general/detection/resources/detection.mp4 name=src_0 ! decodebin ! videoscale ! video/x-raw, pixel-aspect-ratio=1/1 ! videoconvert ! queue ! hailonet hef-path=/home/matteius/yolov5m.hef is-active=true ! queue leaky=no max-size-buffers=30 max-size-bytes=0 max-size-time=0 ! hailofilter so-path=$TAPPAS_WORKSPACE/apps/h8/gstreamer/libs/post_processes/libmy_post.so qos=false ! videoconvert ! fpsdisplaysink video-sink=ximagesink name=hailo_display sync=true text-overlay=false
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Redistribute latency...
Redistribute latency...
Redistribute latency...
X Error of failed request:  BadValue (integer parameter out of range for operation)
  Major opcode of failed request:  131 (XInputExtension)
  Minor opcode of failed request:  46 ()
  Value in failed request:  0xd
  Serial number of failed request:  61
  Current serial number in output stream:  65

Additionally, I get this with other HEFs as well. The reference HEF from the example, $TAPPAS_WORKSPACE/apps/h8/gstreamer/general/detection/resources/yolov5m_wo_spp_60p.hef, is not built for the 8L, so it's not possible to try running against that one.
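
(In case it helps: as far as I can tell, the pre-built model zoo HEFs can be regenerated for the 8L with hailomz and its hw-arch option, along these lines; the exact CLI flags may differ between hailo_model_zoo versions.)

hailomz compile yolov5m --hw-arch hailo8l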

Similarly, this morning I tried out hailo_model_zoo to build yolov8s for the Hailo-8L. The hailortcli run example works:

$ hailortcli run yolov8s.hef 
Running streaming inference (yolov8s.hef):
  Transform data: true
    Type:      auto
    Quantized: true
Network yolov8s/yolov8s: 100% | 123 | FPS: 24.59 | ETA: 00:00:00
> Inference result:
 Network group: yolov8s
    Frames count: 123
    FPS: 24.59
    Send Rate: 241.77 Mbit/s
    Recv Rate: 240.26 Mbit/s

However, when I try using the model in my pipeline, which should just print the tensors, I get:

$ gst-launch-1.0 filesrc location=$TAPPAS_WORKSPACE/apps/h8/gstreamer/general/detection/resources/detection.mp4 name=src_0 ! decodebin ! videoscale ! video/x-raw, pixel-aspect-ratio=1/1 ! videoconvert ! queue ! hailonet hef-path=/home/matteius/yolov8s.hef is-active=true ! queue leaky=no max-size-buffers=30 max-size-bytes=0 max-size-time=0 ! hailofilter so-path=$TAPPAS_WORKSPACE/apps/h8/gstreamer/libs/post_processes/libmy_post.so qos=false ! videoconvert ! fpsdisplaysink video-sink=ximagesink name=hailo_display sync=true text-overlay=false
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Redistribute latency...
Redistribute latency...
Redistribute latency...
Redistribute latency...
X Error of failed request:  BadValue (integer parameter out of range for operation)
  Major opcode of failed request:  131 (XInputExtension)
  Minor opcode of failed request:  46 ()
  Value in failed request:  0xd
  Serial number of failed request:  61
  Current serial number in output stream:  65
Caught SIGSEGV
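
To get more detail on where the failure happens (and to check whether the crash is tied to the display sink at all), the same pipeline can be rerun with a higher GStreamer debug level and a non-display sink, e.g.:

GST_DEBUG=3 gst-launch-1.0 filesrc location=$TAPPAS_WORKSPACE/apps/h8/gstreamer/general/detection/resources/detection.mp4 name=src_0 ! decodebin ! videoscale ! video/x-raw, pixel-aspect-ratio=1/1 ! videoconvert ! queue ! hailonet hef-path=/home/matteius/yolov8s.hef is-active=true ! queue leaky=no max-size-buffers=30 max-size-bytes=0 max-size-time=0 ! hailofilter so-path=$TAPPAS_WORKSPACE/apps/h8/gstreamer/libs/post_processes/libmy_post.so qos=false ! videoconvert ! fakesink sync=false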

I got a bit further this morning. I traced that XInputExtension error back to video-sink=ximagesink; when I switch it to video-sink=autovideosink, I am actually able to run the model I built against the sample video.

Hey @matt,

Great to hear that you were able to resolve the issue!

To check if the ximagesink is working on your Raspberry Pi, you can run the following command:

gst-launch-1.0 videotestsrc ! ximagesink

If you encounter an error with this command, then you'll need to check the image sink or the X11 setup on your Raspberry Pi.
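
You can also check whether your desktop session is running X11 or Wayland, since ximagesink requires an X server (on recent Raspberry Pi OS releases the default session is usually Wayland):

echo $XDG_SESSION_TYPE
echo $DISPLAY

If it reports wayland, that would also explain why autovideosink, which picks a sink that works in the current environment, behaves better than forcing ximagesink.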

I would also recommend running the following command to check that video output works with an automatically selected sink:

gst-launch-1.0 videotestsrc ! autovideosink

This will help you verify that the video pipeline is functioning correctly on your Raspberry Pi. Let me know if you have any other questions or if you need further assistance!