Issue: Docker container cannot access Hailo hardware for DeGirum inference
I’m encountering issues with running DeGirum models on Hailo hardware from within a Docker container on a Raspberry Pi 5 with Hailo AI Hat.
Environment
- Hardware: Raspberry Pi 5 with Hailo AI Hat
- Models: DeGirum face detection and segmentation models (specifically built for Hailo)
- Python version: 3.9.6
Problem Description
While the DeGirum models work perfectly when running my application directly on the Raspberry Pi, I receive the following error when trying to use them from within a Docker container:
```
degirum.exceptions.DegirumException: Model 'scrfd_10g--640x640_quant_hailort_hailo8l_1' does not have any supported runtime/device combinations that will work on this system
```
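For reference, this is how I compare what PySDK sees on the host versus inside the container (assuming the `degirum sys-info` CLI command that ships with PySDK is available in both environments; `my-scheduler-image` is a placeholder for my actual image name):

```sh
# On the Pi host: this lists the Hailo device among supported runtimes
degirum sys-info

# Inside the container: no Hailo runtime/device combination shows up
docker run --rm my-scheduler-image degirum sys-info
```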
What I’ve tried
I attempted to follow the solution from this community thread by building HailoRT within my Docker container:
```dockerfile
RUN mkdir -p hailort
RUN git clone https://github.com/hailo-ai/hailort.git hailort/sources
RUN cd hailort/sources && git checkout v4.18.0
RUN cd hailort/sources && cmake -S. -Bbuild -DCMAKE_BUILD_TYPE=Release -DHAILO_BUILD_EXAMPLES=1 && sudo cmake --build build --config release --target install
```
But the build fails with the following error:
```
Build files have been written to: /app/hailort/sources/build
The command '/bin/sh -c cd hailort/sources && cmake -S. -Bbuild [...] --target install' returned a non-zero code: 127
ERROR: Service 'scheduler' failed to build : Build failed
```
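My guess is that exit code 127 means "command not found", most likely because the base image ships without `sudo` (and possibly without `git` or `cmake` either). This is the variant I’m experimenting with instead; it’s a sketch that assumes a Debian-based base image and drops `sudo`, since Dockerfile build steps already run as root:

```dockerfile
# Install build prerequisites (slim Debian-based images ship without them)
RUN apt-get update && apt-get install -y --no-install-recommends \
        git cmake build-essential \
    && rm -rf /var/lib/apt/lists/*

RUN mkdir -p hailort \
    && git clone --branch v4.18.0 https://github.com/hailo-ai/hailort.git hailort/sources

# No sudo: RUN steps execute as root during the image build
RUN cd hailort/sources \
    && cmake -S. -Bbuild -DCMAKE_BUILD_TYPE=Release -DHAILO_BUILD_EXAMPLES=1 \
    && cmake --build build --config release --target install
```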
I’ve also installed the HailoRT wheel matching my Python 3.9 interpreter (cp39):
```dockerfile
COPY ./static/wheel/hailort-4.21.0-cp39-cp39-linux_aarch64.whl /setup/hailort-4.21.0-cp39-cp39-linux_aarch64.whl
RUN pip install /setup/hailort-4.21.0-cp39-cp39-linux_aarch64.whl
```
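One thing I’m unsure about is version alignment: this wheel is 4.21.0, while the sources I tried to build above are v4.18.0, and as I understand it the HailoRT library inside the container also has to match the Hailo PCIe driver and firmware on the host. This is how I’m checking the versions on each side (a sketch; `hailo_pci` is the driver module name on my Pi, and `hailortcli` is only available where HailoRT is installed):

```sh
# On the Pi host: report the kernel driver version
modinfo hailo_pci | grep -i version

# Wherever HailoRT is installed: report runtime and firmware versions
hailortcli fw-control identify
```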
Could someone please advise on how to properly configure Docker to access the Hailo hardware and run DeGirum models from within a container?
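I suspect I need to pass the Hailo device node through to the container explicitly; this is what I plan to try next (a sketch; it assumes the device node on the host is `/dev/hailo0`, and `my-scheduler-image` again stands in for the image built by my `scheduler` service):

```sh
# Expose the Hailo device node to the container; mounting udev data
# read-only is something I've seen suggested for device enumeration
# (an assumption on my part, not verified)
docker run --rm \
    --device /dev/hailo0:/dev/hailo0 \
    -v /run/udev:/run/udev:ro \
    my-scheduler-image
```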