GStreamer parallel inference

Hi all,

Running parallel inference in GStreamer pipelines used to be quite easy:
I was able to run the following pipeline multiple times on the same device:

gst-launch-1.0 -e videotestsrc ! hailonet hef-path=model/yolov7_tiny_anymos.hef is-active=true device-count=0  ! fakesink

Now (version 4.17.1) the second pipeline I start runs into the following error:

[HailoRT] [error] CHECK failed - Failed to create vdevice. there are not enough free devices. requested: 1, found: 0
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_OUT_OF_PHYSICAL_DEVICES(74)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_OUT_OF_PHYSICAL_DEVICES(74)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_OUT_OF_PHYSICAL_DEVICES(74)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_OUT_OF_PHYSICAL_DEVICES(74)
CHECK_EXPECTED_AS_STATUS failed with status=74

I did find this post: "How do I run multiple gstreamer pipeline in parallel?" Does that still apply? Is hailort_service now mandatory for running inference in parallel?

We have had the Multi-Process Service since HailoRT v4.10.0, and I do not think the behavior has changed. Some software/service needs to manage access to the hardware when you have multiple processes.
If you have multiple networks in a single process, the model scheduler can handle that without the service.
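For reference, the single-process case with two networks sharing one VDevice through the model scheduler looks roughly like the sketch below. The scheduling-algorithm and vdevice-key property names are assumptions based on recent TAPPAS releases and may differ in yours (check gst-inspect-1.0 hailonet); the second HEF path is just a placeholder:

gst-launch-1.0 -e \
  videotestsrc ! hailonet hef-path=model/yolov7_tiny_anymos.hef vdevice-key=1 scheduling-algorithm=1 ! fakesink \
  videotestsrc ! hailonet hef-path=model/another_model.hef vdevice-key=1 scheduling-algorithm=1 ! fakesink

Here scheduling-algorithm=1 is meant to select round robin, and the shared vdevice-key should let both hailonet elements use the same virtual device, so the scheduler can switch between the two networks.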

Have a look at the HailoRT documentation available in the Developer Zone. In the online version you can easily compare versions. Check the Running Inference section for a description of all the options.

Hailo Developer Zone HailoRT v4.17.0
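For the multi-process case, the runtime side boils down to something like the following sketch. It assumes the service is installed and running as hailort.service, and that your hailonet build exposes a multi-process-service property (again, verify the exact property name with gst-inspect-1.0 hailonet for your TAPPAS version):

# once, on the target
systemctl enable --now hailort.service

# process 1
gst-launch-1.0 -e videotestsrc ! hailonet hef-path=model/yolov7_tiny_anymos.hef multi-process-service=true ! fakesink

# process 2, started in parallel from a second shell
gst-launch-1.0 -e videotestsrc ! hailonet hef-path=model/yolov7_tiny_anymos.hef multi-process-service=true ! fakesink

With the service arbitrating access to the physical device, the second pipeline should no longer fail with HAILO_OUT_OF_PHYSICAL_DEVICES.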

Thank you for your reply. That might indeed be the issue: I previously ran the multiple pipelines within a single process, which is why I did not need the service.

To run multiple pipelines in multiple processes, I wrote a Yocto recipe for hailort_service, since I couldn't find one in meta-hailo.

Here are my files in case other people are having the same issue.
(tested on kirkstone)
hailortservice_4.17.1.bb

DESCRIPTION = "hailort_service - Multi-Process Service enables the ability to manage and share a \
               Hailo device between multiple processes, thus providing the ability to use multi-process inference"

LICENSE = "MIT"
LIC_FILES_CHKSUM = "file://hailort/LICENSE;md5=48b1c947c88868c23e4fb874890be6fc \
                    file://hailort/LICENSE-3RD-PARTY.md;md5=f491a052559dbcdae697362cd5a13c96"

SRC_URI = "git://git@github.com/hailo-ai/hailort.git;protocol=https;branch=master"
SRCREV = "e2190aeda847ab22057d162d08b516c39ac36ab8"

S = "${WORKDIR}/git"

inherit systemd
inherit hailort-base
inherit cmake

# Build HailoRT with the Multi-Process Service enabled and compile only the service binary
EXTRA_OECMAKE += "-DHAILO_BUILD_SERVICE=1"

RDEPENDS:${PN} += "libhailort"
OECMAKE_TARGET_COMPILE = "hailort_service"
SYSTEMD_SERVICE:${PN} = "hailort.service"

do_install:append() {

    # systemd unit shipped with the HailoRT sources
    install -d ${D}${systemd_unitdir}/system
    install -m 0644 ${S}/hailort/hailort_service/hailort.service ${D}${systemd_unitdir}/system/

    # default environment/configuration file for the service
    install -d ${D}/etc/default/
    install -m 0644 ${S}/hailort/hailort_service/hailort_service ${D}/etc/default/

    # tmpfiles.d entry so /var/log/hailo exists at boot for the service logs
    echo "d /var/log/hailo 0755 - - -" > ${S}/hailortservice.conf

    install -d ${D}/usr/lib/tmpfiles.d/
    install -m 0644 ${S}/hailortservice.conf ${D}/usr/lib/tmpfiles.d/

    # the hailort_service binary built by this recipe
    install -d ${D}/usr/local/bin
    install -m 0755 ${S}/../bin/hailort_service ${D}/usr/local/bin
}

FILES:${PN} += " \
  ${base_libdir}/systemd \
  /usr/local/bin/hailort_service \
  /etc/default/hailort_service \
  /usr/lib/tmpfiles.d/hailortservice.conf \
    "

and libhailort_%.bbappend, so that libhailort itself is also built with service support enabled:

EXTRA_OECMAKE += "-DHAILO_BUILD_SERVICE=1"
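To pull this into an image, add the package with IMAGE_INSTALL:append = " hailortservice" in local.conf or your image recipe (the package name follows from the recipe file name above), rebuild, and check on the target that the unit is up before starting the pipelines:

# on the target, after flashing the new image
systemctl status hailort.service         # the systemd class should have enabled it already
systemctl enable --now hailort.service   # only needed if it is not active yet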