Fun with Raspberry Pi 5 with 104 TOPS

Hello everyone,

I recently decided to have some fun with my Raspberry Pi 5 by adding a Lanner Falcon-H8. My version of the Falcon-H8 has four Hailo-8 AI accelerators providing a total of 104 TOPS. To make this work, I used a Raspberry Pi HAT from Pineboards.

Here’s an image of my setup (without the heatsink):

As you can see, the Raspberry Pi 5 is connected to the Falcon-H8 PCIe card via the Pineboards HAT. The HAT makes it easy to integrate the PCIe card with the Raspberry Pi because the PCIe slot is open-ended and has a separate power connector, so it will not fry your Pi. I used a 12V 5A power supply with a barrel jack connector.

I have been running multiple instances of the applications from our Hailo RPi 5 Example repository.
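If you want to try the same thing, something like the Python launcher below can start several example pipelines in parallel. The script paths are only an illustration based on the layout of the hailo-rpi5-examples repository and may differ in your checkout:

```python
# Sketch: launch several example pipelines in parallel on one Pi 5.
# The script paths below are illustrative (based on the hailo-rpi5-examples
# repository layout) and may need adjusting for your local checkout.
import subprocess

pipelines = [
    ["python", "basic_pipelines/detection.py"],
    ["python", "basic_pipelines/detection.py"],
    ["python", "basic_pipelines/pose_estimation.py"],
    ["python", "basic_pipelines/pose_estimation.py"],
]

# Each pipeline runs as its own process, just like starting it from a terminal.
procs = [subprocess.Popen(cmd) for cmd in pipelines]
for p in procs:
    p.wait()
```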

Here’s an image of two Object Detection and two Pose Estimation examples in action:

You can see the four devices in the HailoRT CLI Monitor.
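If you want to check that all four devices are detected before starting the pipelines, you can also enumerate them from Python. This is a rough sketch assuming the HailoRT Python bindings (hailo_platform) are installed and that Device.scan() is available in your HailoRT version:

```python
# Sketch: list the Hailo devices visible to HailoRT on this system.
# Assumes the hailo_platform Python package (HailoRT bindings) is installed.
from hailo_platform import Device

device_ids = Device.scan()
print(f"Found {len(device_ids)} Hailo device(s):")
for dev_id in device_ids:
    print(f"  {dev_id}")
```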

While this setup is primarily for fun and experimentation, it showcases the potential of using Lanner’s Falcon PCIe cards in more powerful systems. If you’re interested in running multiple AI tasks on many camera streams, you might want to check out the Falcon family of PCIe cards.

The Falcon-H8 accommodates 4, 5, or 6 Hailo-8 AI processors and a PCIe switch.

Lanner Falcon-H8

The Falcon Lite accommodates 1, 2, or 4 Hailo-8 AI processors and is available in different configurations.

Lanner Falcon Lite

Here is a link to the Pineboards HAT.

Pineboards HAT uPCIty Lite for Raspberry Pi 5

Feel free to follow the links to learn more about these products and how they might fit into your projects. If you have any questions or need further details, don’t hesitate to ask.

Happy experimenting!

Would this be a setup capable of running LLMs?

For large language models, we have designed our next-generation accelerator, the Hailo-10H.

Hailo-10H M.2 Generative AI Acceleration Module

As you can see from the image, the module has external DDR memory on board. This memory allows the Hailo-10H to run LLMs, which are much larger than CNNs, locally on the module without stressing the PCIe interface to the host.

The Hailo-10H is not yet generally available. We are working on the software support to ensure we provide a complete solution.

Hi @klausk, this looks very interesting. We have been looking for a way to run our custom models on multiple Hailo-8Ls using a Raspberry Pi 5. We have converted the models and can successfully run the pipeline on a Raspberry Pi AI HAT+, and now we want to improve performance by parallelizing execution across multiple Hailo devices.

Will we be able to use one of the Raspberry Pi PCIe expansion boards, for example this quad board? This one also allows the use of an external power supply and is more compact than the Lanner setup. Do you know if multiple Hailo-8Ls can be used with this board? Are there any general restrictions on using multiple Hailo-8Ls with a Raspberry Pi 5?

Thanks for any insights!