Multistream Inference on Hailo8 with Pi 5

I have been trying TAPPAS on the Pi 5 with the Hailo8 and found that some of the examples run on the Pi 5, so I have written a wiki about multistream inference on the Hailo8.
Here is the wiki link:

Here is the YouTube link:

This example can be used for applications like an NVR (network video recorder), etc.
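For illustration only, here is a minimal Python sketch of what a tiled, NVR-style multistream detection pipeline can look like with GStreamer and the TAPPAS Hailo elements. It is not the wiki's exact pipeline: the video file names, the yolov8m.hef path, and the post-process .so name are placeholders, and the element/property names (hailonet hef-path/batch-size, hailofilter so-path) follow the TAPPAS documentation and may differ between versions.

```python
# Sketch: four video files decoded, scaled to the YOLOv8m input size, run
# through hailonet/hailofilter/hailooverlay, and tiled 2x2 with a compositor.
# Paths and file names below are placeholders, not the wiki's actual files.
import gi

gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib

VIDEO_FILES = ["cam0.mp4", "cam1.mp4", "cam2.mp4", "cam3.mp4"]  # placeholders
HEF_PATH = "yolov8m.hef"                      # compiled model for the Hailo8
POSTPROCESS_SO = "libyolo_hailortpp_post.so"  # TAPPAS YOLO post-process lib
SIZE = 640                                    # YOLOv8m input resolution


def build_description():
    # Compositor declaration with fixed tile positions for a 2x2 grid.
    comp = "compositor name=comp " + " ".join(
        f"sink_{i}::xpos={(i % 2) * SIZE} sink_{i}::ypos={(i // 2) * SIZE}"
        for i in range(len(VIDEO_FILES))
    )
    chains = [f"{comp} ! videoconvert ! autovideosink"]
    for i, path in enumerate(VIDEO_FILES):
        # Each branch runs its own hailonet instance; sharing one Hailo8
        # across several instances relies on the HailoRT model scheduler.
        chains.append(
            f"filesrc location={path} ! decodebin ! videoconvert ! videoscale"
            f" ! video/x-raw,format=RGB,width={SIZE},height={SIZE}"
            f" ! hailonet hef-path={HEF_PATH} batch-size=8"
            f" ! hailofilter so-path={POSTPROCESS_SO}"
            f" ! hailooverlay ! videoconvert ! comp.sink_{i}"
        )
    return " ".join(chains)


def main():
    Gst.init(None)
    pipeline = Gst.parse_launch(build_description())
    pipeline.set_state(Gst.State.PLAYING)
    loop = GLib.MainLoop()
    try:
        loop.run()
    finally:
        pipeline.set_state(Gst.State.NULL)


if __name__ == "__main__":
    main()
```

Note that the TAPPAS multistream apps typically funnel all streams through a single hailonet instance (for example with hailoroundrobin/hailostreamrouter); the per-branch layout above is just the simplest sketch of the idea.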

We have added the ability to modify the batch size in the new code, which significantly improves inference speed. In tests with YOLOv8m (640x640 input, batch size 8) on 720p video input, the inference speed is as follows:

Channel count        PCIe Gen2 (FPS)   PCIe Gen3 (FPS)
1 channel stream     39.82             76.99
2 channel streams    19.86             38.21
4 channel streams    8.45              16.94
8 channel streams    3.85              8.15
12 channel streams   2.94              5.43
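For reference, the batch size can also be set when running the model directly through the HailoRT Python API (the hailo_platform package) rather than through a GStreamer pipeline. The sketch below follows the pattern of Hailo's Python inference tutorial and is not the wiki's code: "yolov8m.hef" is a placeholder path, the dummy input is zeros, and API details may differ between HailoRT versions.

```python
# Sketch: configure a HEF with batch_size=8 and run one batched inference
# using the HailoRT Python API. The HEF path is a placeholder.
import numpy as np
from hailo_platform import (
    HEF,
    ConfigureParams,
    FormatType,
    HailoStreamInterface,
    InferVStreams,
    InputVStreamParams,
    OutputVStreamParams,
    VDevice,
)

BATCH_SIZE = 8
HEF_PATH = "yolov8m.hef"  # placeholder path to the compiled model

hef = HEF(HEF_PATH)

with VDevice() as target:
    # Request the batch size in the configure parameters before loading the
    # HEF onto the device.
    configure_params = ConfigureParams.create_from_hef(
        hef, interface=HailoStreamInterface.PCIe
    )
    for name in configure_params:
        configure_params[name].batch_size = BATCH_SIZE

    network_group = target.configure(hef, configure_params)[0]
    network_group_params = network_group.create_params()

    input_params = InputVStreamParams.make(
        network_group, format_type=FormatType.FLOAT32
    )
    output_params = OutputVStreamParams.make(
        network_group, format_type=FormatType.FLOAT32
    )

    # One batch of dummy 640x640x3 frames, just to exercise the batched path.
    input_info = hef.get_input_vstream_infos()[0]
    dummy_batch = np.zeros((BATCH_SIZE, *input_info.shape), dtype=np.float32)

    with InferVStreams(network_group, input_params, output_params) as infer:
        with network_group.activate(network_group_params):
            results = infer.infer({input_info.name: dummy_batch})
            print("output tensors:", list(results.keys()))
```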