Hello there,
Hardware: Raspberry Pi 5 with Hailo AI Hat (Hailo8L)
Operating System: Raspberry Pi OS (64-bit), Debian GNU/Linux 12 (Bookworm)
Hailo Version: 4.18.0
I encountered a frozen sink issue with the Hailo GStreamer pipelines.
I modified the detection example pipeline in the hailo-rpi5-examples repository to take multiple RTSP streams as input, using the hailoroundrobin element as the multiplexer. Here is the pipeline string:
```
rtspsrc location=rtsp://uri1 name=src_0 message-forward=true !
  rtph264depay !
  queue name=hailo_preprocess_q_0 leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0 !
  decodebin !
  queue leaky=downstream max-size-buffers=5 max-size-bytes=0 max-size-time=0 !
  videoscale n-threads=8 ! video/x-raw,pixel-aspect-ratio=1/1 !
  videoconvert n-threads=8 ! video/x-raw,pixel-aspect-ratio=1/1 !
  roundrobin.sink_0
hailo_streamrouter.src_0 !
  queue name=comp_q_0 leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0 !
  comp.sink_0
rtspsrc location=rtsp://uri2 name=src_1 message-forward=true !
  rtph264depay !
  queue name=hailo_preprocess_q_1 leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0 !
  decodebin !
  queue leaky=downstream max-size-buffers=5 max-size-bytes=0 max-size-time=0 !
  videoscale n-threads=8 ! video/x-raw,pixel-aspect-ratio=1/1 !
  videoconvert n-threads=8 ! video/x-raw,pixel-aspect-ratio=1/1 !
  roundrobin.sink_1
hailo_streamrouter.src_1 !
  queue name=comp_q_1 leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0 !
  comp.sink_1
hailoroundrobin mode=2 name=roundrobin !
  queue name=inference_pre_infer_q_0 leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0 !
  hailonet hef-path=../resources/yolov8s_h8l.hef nms-score-threshold=0.3 nms-iou-threshold=0.45 output-format-type=HAILO_FORMAT_TYPE_FLOAT32 !
  queue name=inference_postprocess0 leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0 !
  hailofilter so-path=/path/to/hailo/tappas/post_processes/libyolo_hailortpp_post.so qos=false !
  queue name=identity_callback_q leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0 !
  identity name=identity_callback !
  hailostreamrouter name=hailo_streamrouter src_0::input-streams="<sink_0>" src_1::input-streams="<sink_1>"
compositor name=comp start-time-selection=0
  sink_0::xpos=0 sink_0::ypos=0 sink_1::xpos=640 sink_1::ypos=0
  sink_2::xpos=1280 sink_2::ypos=0 sink_3::xpos=1920 sink_3::ypos=0
  sink_4::xpos=0 sink_4::ypos=640 sink_5::xpos=640 sink_5::ypos=640
  sink_6::xpos=1280 sink_6::ypos=640 sink_7::xpos=1920 sink_7::ypos=640 !
  queue name=hailo_display_videoconvert_q leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0 !
  videoconvert name=hailo_display_videoconvert n-threads=2 qos=false !
  videoscale !
  queue name=hailo_display_q leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0 !
  autovideosink sync=true
```
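Since each source branch differs only in its index and URI, I generate that repetitive part of the string with a small helper along these lines (an illustrative sketch of my own code, not something from the hailo-rpi5-examples repository):

```python
# Sketch: build the per-source part of the pipeline string for N RTSP inputs.
# Element and property names mirror the pipeline string above; the helper
# functions themselves (source_branch, all_branches) are illustrative.

QUEUE = "queue name={name} leaky=no max-size-buffers=100 max-size-bytes=0 max-size-time=0"

def source_branch(i: int, uri: str) -> str:
    """One decode branch feeding roundrobin, plus its router-to-compositor leg."""
    return (
        f"rtspsrc location={uri} name=src_{i} message-forward=true ! "
        "rtph264depay ! "
        + QUEUE.format(name=f"hailo_preprocess_q_{i}") + " ! "
        "decodebin ! "
        "queue leaky=downstream max-size-buffers=5 max-size-bytes=0 max-size-time=0 ! "
        "videoscale n-threads=8 ! video/x-raw,pixel-aspect-ratio=1/1 ! "
        "videoconvert n-threads=8 ! video/x-raw,pixel-aspect-ratio=1/1 ! "
        f"roundrobin.sink_{i} "
        f"hailo_streamrouter.src_{i} ! "
        + QUEUE.format(name=f"comp_q_{i}") + " ! "
        f"comp.sink_{i} "
    )

def all_branches(uris: list[str]) -> str:
    """Concatenate the branches for every RTSP URI, indexed from 0."""
    return "".join(source_branch(i, uri) for i, uri in enumerate(uris))
```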
It works, but once the number of RTSP sources exceeds 5 streams, the output freezes and updates only once every few seconds.
May I know if this is expected (e.g., a hardware limitation), or whether there is anything I might have missed in my pipeline?
Additionally, I built two other pipelines: one for person re-identification (re-ID) and one for a cascaded network (detection + classification). Both experience the same issue, but they freeze with as few as 2 input sources (file or RTSP).
Interestingly, the degree and symptoms of the freezing differ in each pipeline:
- Multi-stream Detection Pipeline: When there are more than 4 input RTSP streams, the streams “take turns” to freeze.
- Two-stream Re-ID Pipeline: The first stream runs smoothly, but the second stream freezes.
- Two-stream Cascaded Network Pipeline: Both streams freeze after a few seconds (usually around 2-3 seconds).
In the third case, I noticed something unusual. I implemented a custom Python module to `print()` the detection and classification results, and used the hailopython plugin to call the module. Surprisingly, even though the output sink (I tried both `fpsdisplaysink` and `rtspclientsink`) freezes, the detection and classification results are still being updated and printed in the terminal. This indicates that inference (for both the detection and classification models) is running smoothly, but at some point after `hailonet`, the pipeline begins to freeze.
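For context, my print module looks roughly like this. The `run(video_frame)` entry point and the `hailo.HAILO_DETECTION` accessors follow TAPPAS hailopython conventions, but treat the exact names as assumptions for your TAPPAS version; the formatting helper is just illustrative:

```python
# Sketch of the custom hailopython module that prints detection results as
# buffers pass through the pipeline.

def format_detections(pairs):
    """Pure helper: render (label, confidence) pairs as one log line."""
    if not pairs:
        return "no detections"
    return ", ".join(f"{label}: {conf:.2f}" for label, conf in pairs)

def run(video_frame):
    # Imported lazily so the helper above stays usable without TAPPAS installed.
    import hailo
    # Pull the detections attached to this frame's ROI by the earlier
    # hailofilter post-process, then print them.
    dets = video_frame.roi.get_objects_typed(hailo.HAILO_DETECTION)
    print(format_detections([(d.get_label(), d.get_confidence()) for d in dets]))
```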
There are only a few elements after the `hailonet` element: `hailooverlay`, `hailostreamrouter`, `compositor`, `videoconvert`, `videoscale`, and a sink (`fpsdisplaysink` or `rtspclientsink`). Among these, I tried disabling `hailooverlay`, but it did not help.
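To narrow down where buffers stop flowing, I have also been experimenting with pad probes that flag large gaps between consecutive buffer timestamps after each suspect element. A minimal sketch of the gap logic (the pad-probe wiring in the comment assumes PyGObject/GStreamer and standard `Gst.Pad.add_probe` usage; element names there are placeholders):

```python
# Sketch: detect stalls by watching the gap between consecutive buffer PTS
# values seen on a pad. The detector itself is plain Python.

def make_stall_detector(max_gap_ns):
    """Return a callable that takes a buffer PTS (ns) and reports True when
    the gap since the previous buffer exceeds max_gap_ns."""
    last = {"pts": None}

    def on_pts(pts_ns):
        stalled = last["pts"] is not None and pts_ns - last["pts"] > max_gap_ns
        last["pts"] = pts_ns
        return stalled

    return on_pts

# In a live pipeline it would be attached roughly like this:
#   from gi.repository import Gst
#   detect = make_stall_detector(500_000_000)  # flag gaps longer than 0.5 s
#   def probe(pad, info):
#       if detect(info.get_buffer().pts):
#           print(f"stall after {pad.get_parent_element().get_name()}")
#       return Gst.PadProbeReturn.OK
#   comp.get_static_pad("src").add_probe(Gst.PadProbeType.BUFFER, probe)
```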
I am happy to share the pipeline strings and my custom libraries with your developers for debugging purposes. Please let me know if you require any additional information.