Running inference on a Hailo-8 M.2 accelerator module over a 4-lane PCIe interface

The FPS results available in the Hailo Model Explorer for various models correspond to an x2-lane PCIe interface, but on the Pi 5 we can only run the module over a single-lane interface. So I want to test the Hailo-8 module with x2-lane and x4-lane PCIe interfaces and see what FPS can be achieved. For this I will have to use the module in a PC with an M.2 slot, but will that be enough, particularly with respect to the drivers needed to run inference on this module? Are they available for other systems such as x86 on Windows, Linux, or other OSes? Has anyone done such testing before?

Hi @Sameer_Nilkhan,

The performance numbers reported in the Model Explorer and in the Hailo Model Zoo were obtained on a machine based on the Intel® Core™ i5-9400 CPU, with the Hailo-8 connected to it over PCIe Gen 3.0 x4.
If you are using a different host system with fewer PCIe lanes, we recommend measuring the model's performance with the hailortcli run or hailortcli run2 commands.
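As a minimal sketch of the two commands mentioned above (the .hef path is a placeholder; substitute a model compiled for Hailo-8, e.g. one downloaded from the Model Zoo):

```shell
# Benchmark a compiled model directly on the device:
hailortcli run /path/to/model.hef

# Or use the run2 variant, which goes through the scheduler;
# networks are added with the set-net subcommand:
hailortcli run2 set-net /path/to/model.hef
```

Both commands print the achieved FPS at the end of the run, so you can compare the same .hef across x1, x2, and x4 hosts.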
To run such tests, you first need to install two software components: the HailoRT library and the Hailo PCIe driver.

.DEB or .MSI packages for both can be downloaded from the Hailo Developer Zone (for Ubuntu 22/24 or Windows 10/11). For other OSes (e.g. Android), you have to build the library and driver from source using the code on GitHub. Please refer to the HailoRT User Guide for installation instructions (you need to log in to the Developer Zone before clicking the link).
Important: for Hailo-8, use HailoRT version 4.x (the latest is 4.23.0).
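One more note on the lane-count question: the link can train down to fewer lanes than the slot physically provides, so it is worth confirming what was actually negotiated before comparing FPS numbers. On Linux this can be checked with lspci; 1e60 below is the Hailo PCI vendor ID:

```shell
# Show the negotiated PCIe link status of the Hailo device.
# LnkCap = what the device supports, LnkSta = what was actually negotiated
# (look at the "Width xN" field in each line).
sudo lspci -d 1e60: -vv | grep -E 'LnkCap:|LnkSta:'
```

If LnkSta reports a smaller width than LnkCap, the host, riser, or adapter is the limiting factor, not the module.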