Development workflow Hailo 15H

Hello everyone,

I have a few questions regarding the development workflow when working with a Hailo15 platform, such as the Hailo15 SBC. I am unsure how to start developing custom apps, specifically what the best way is to develop and test them.

My setup is as follows:

  • Hailo15H SBC with two LeopardImaging cameras
  • Development machine with Ubuntu 22.04

I want to develop both Python and C++ applications. For Python applications, I guess I can work directly on the SBC; C++ applications, however, need to be compiled.

I would like to use my Linux host as a development machine and cross-compile C++ applications for the Hailo15 SBC. I downloaded the Software Suite Docker File and launched the Docker container.

Now however I am stuck. For experimentation, I wanted to recompile the ai_example_app but failed. There is some documentation about cross compilation here (tappas/tools/cross_compiler at master · hailo-ai/tappas · GitHub), but I am unsure if this is the correct way to do it.

Could someone explain to me, in simple terms, the steps I need to take in order to develop my own Python and C++ applications for the Hailo15 SBC on a Linux development host?

Thank you very much, kind regards
Steve


The Hailo AI Software Suite Docker is used to convert models from TFLite and ONNX format into HEF for Hailo-8 and Hailo-15.
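For example, a Model Zoo network can be compiled for the Hailo-15 inside the suite container along these lines (a sketch only; model names and flags depend on the suite / Model Zoo version you use):

hailomz compile <model_name> --hw-arch hailo15h    # parses, optimizes and compiles the model into a .hef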

The SDK is part of the Yocto build. You will find a prebuilt SDK in the Hailo-15 Vision Processor SW package.
When you build your own Yocto image, you can build the SDK yourself. This is described in the Hailo OS user guide.
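When building your own image, the SDK is generated with the standard Yocto mechanism, roughly like this (the image name below is an assumption; use the image recipe from your build as described in the Hailo OS user guide):

bitbake core-image-minimal -c populate_sdk    # assumed image name; the *-toolchain-*.sh installer ends up under tmp/deploy/sdk/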

I would recommend using C++ as much as possible to reach good performance. The Cortex-A53 processor in the Hailo-15 was designed for efficiency and not for maximum performance.

Thank you very much for your reply

I have downloaded the Hailo-15 Vision Processor SW package and located the SDK. I was also able to build a sample application, just a hello-world, on my host and run it on the Hailo-15 SBC. So far so good…
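For reference, this was just the standard Yocto SDK flow, roughly (the SDK install path, environment-setup file name and board IP below are only examples; use whatever your SDK installer printed and your own network setup):

source /opt/poky/4.0.23/environment-setup-armv8a-poky-linux    # exact file name depends on the installed toolchain
$CXX hello.cpp -o hello                                        # the environment script exports CC/CXX etc. for cross-compilation
scp hello root@10.0.0.1:/home/root/ && ssh root@10.0.0.1 ./hello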

I would now like to start developing my vision application, starting by recompiling and modifying the ai_example_app in the tappas repo. I tried to follow the documentation, and I would like to use the docker container instead of a native installation.

In the tappas master branch (not master-vpu), I found the documentation on using a prebuilt docker container. However, when trying to download the corresponding files from the dev zone, the following is stated for tappas:

For vision processors, the TAPPAS package is included in the Vision Processors Software Package, to be compiled into the application processor’s image.

However, in the mentioned package, I find no reference to Tappas. I then downloaded the TAPPAS – Docker for x86_64 files and launched the docker container as explained here. As described here, I then copied the toolchain script from the Vision Processor Software Package into the running docker container and tried to compile the apps using the cross_compile_tappas script:

./cross_compile_tappas.py aarch64 hailo15 debug /local/workspace/tappas/sdk/ --build-lib apps

I first had to edit the script to use python3 instead of python, as there seems to be no python binary or symlink.
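As an aside, instead of editing the script, one could presumably also provide a python alias inside the container, e.g.:

apt-get update && apt-get install -y python-is-python3    # Ubuntu package that makes 'python' point to python3
# or simply: ln -s /usr/bin/python3 /usr/bin/python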

After editing the script, the toolchain seems to be installed correctly, but when meson tries to set up the project, the following error occurs:

INFO:/local/workspace/tappas/tools/cross_compiler/common.py:Starting the installation of the toolchain
INFO:/local/workspace/tappas/tools/cross_compiler/common.py:installing /local/workspace/tappas/sdk/poky-glibc-x86_64-core-image-minimal-armv8a-hailo15-sbc-toolchain-4.0.23.sh
INFO:/local/workspace/tappas/tools/cross_compiler/common.py:Toolchain ready to use (/local/workspace/tappas/sdk)
INFO:/local/workspace/tappas/tools/cross_compiler/common.py:Building gsthailotools plugins and post processes
Build dir: /local/workspace/tappas/tools/cross_compiler/aarch64-gsthailotools-build-debug/hailo
INFO:/local/workspace/tappas/tools/cross_compiler/common.py:Running Meson build.
meson-wrapper: Implicit setup command assumed
meson-wrapper: Running meson with setup options: " --cross-file=/local/workspace/tappas/sdk/sysroots/x86_64-pokysdk-linux/usr/share/meson/aarch64-poky-linux-meson.cross --native-file=/local/workspace/tappas/sdk/sysroots/x86_64-pokysdk-linux/usr/share/meson/meson.native "
The Meson build system
Version: 0.61.3
Source dir: /local/workspace/tappas/core/hailo
Build dir: /local/workspace/tappas/tools/cross_compiler/aarch64-gsthailotools-build-debug/hailo
Build type: cross build

meson.build:3:0: ERROR: Unknown options: "include_unit_tests"

A full log can be found at /local/workspace/tappas/tools/cross_compiler/aarch64-gsthailotools-build-debug/hailo/meson-logs/meson-log.txt
Traceback (most recent call last):
  File "/local/workspace/tappas/tools/cross_compiler/./cross_compile_tappas.py", line 98, in <module>
    gst_installer.build()
  File "/local/workspace/tappas/tools/cross_compiler/common.py", line 285, in build
    self.run_meson_build_command(env)
  File "/local/workspace/tappas/tools/cross_compiler/common.py", line 208, in run_meson_build_command
    self._runner.run(build_cmd, env=env, print_output=True)
  File "/local/workspace/tappas/tools/cross_compiler/common.py", line 113, in run
    p.check_returncode()
  File "/usr/lib/python3.10/subprocess.py", line 457, in check_returncode
    raise CalledProcessError(self.returncode, self.args, self.stdout,
subprocess.CalledProcessError: Command '['meson', '/local/workspace/tappas/tools/cross_compiler/aarch64-gsthailotools-build-debug/hailo', '--buildtype', 'debug', '-Dlibargs=-I/local/workspace/tappas/sdk/sysroots/aarch64-poky-linux/usr/include/hailort,-I/local/workspace/tappas/sdk/sysroots/aarch64-poky-linux/usr/include/gst-hailo/metadata,-std=c++17', '-Dprefix=/local/workspace/tappas/sdk/sysroots/aarch64-poky-linux/usr', '-Dinclude_blas=false', '-Dtarget_platform=hailo15', '-Dtarget=apps', '-Dlibxtensor=/local/workspace/tappas/core/open_source/xtensor_stack/base', '-Dlibblas=/local/workspace/tappas/core/open_source/xtensor_stack/blas', '-Dlibcxxopts=/local/workspace/tappas/core/open_source/cxxopts', '-Dlibrapidjson=/local/workspace/tappas/core/open_source/rapidjson', '-Dinclude_unit_tests=false']' returned non-zero exit status 1.

I am almost certain that I am doing something wrong, but I would be glad if someone could help me get the development environment ready to cross-compile the example apps and get started with developing my own application.

The previous meson issue was only a configuration problem.

However, I still run into problems when cross compiling the ai_example_app. There are some definition issues, e.g.:

...
../../../../core/hailo/apps/hailo15/ai_example_app/infra/ai_stage.hpp:127:114: error: 'CMA' was not declared in this scope
  127 |                                                                                              m_output_pool_size, CMA, tensor_size, tensor_name);
      |                                                                                                                  ^~~
../../../../core/hailo/apps/hailo15/ai_example_app/infra/ai_stage.hpp: In member function 'AppStatus HailortAsyncStage::set_pix_buf(HailoMediaLibraryBufferPtr)':
../../../../core/hailo/apps/hailo15/ai_example_app/infra/ai_stage.hpp:174:39: error: 'using element_type = struct hailo_media_library_buffer' {aka 'struct hailo_media_library_buffer'} has no member named 'get_plane'; did you mean 'get_plane_fd'?
  174 |         auto y_plane_buffer = buffer->get_plane(0);
      |                                       ^~~~~~~~~
      |                                       get_plane_fd
...

Might this be due to a mismatch between the tappas docker image and the toolchain? The docker image is intended for accelerators.

As mentioned above, I cannot find the prebuilt tappas docker images for the vision processor anywhere. The manual installation guide says to download tappas from the Hailo developer zone (tappas_VERSION_linux_installer.zip), which I also cannot find when selecting ‘Vision Processor’ in the download section.

Some help regarding the setup of Tappas on a Linux host for the Hailo 15 Vision Processor would be much appreciated, so that I can start developing my own C++ vision applications.

Hi @funs

I recommend cloning this repo if you want to modify the code.
Make sure to switch to the branch corresponding to the image you burned.
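For example, once the repo is cloned (the branch/tag name below is a placeholder):

cd <cloned-repo>
git branch -r                                        # list the available release branches
git checkout <branch-matching-your-image-version>    # pick the one matching the flashed image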

This repo contains all the relevant documentation and instructions on how to compile.
If you run into any issues while compiling, let me know.

Hello @lihis

Thank you for your post. I checked out the repo you mentioned. Unfortunately, I still have trouble getting things up and running.

The documentation describing the installation still says to download the following:

For Docker

Download from Hailo developer zone tappas_VERSION_ARCH_docker.zip and unzip the file, it should contain the following files:

For Vision Processor, this file does not exist, at least not where the corresponding file for the accelerator would be.

The Vision Processor Software Package does also not contain the Tappas package.

Manual Install

According to the manual installation instructions, I should install HailoRT and then Tappas.

But again, those files do not exist in the Software Download Center or in the Vision Processor Software Package.

I nevertheless tried to build a docker image using the build script in this repo:

./build_docker.sh --target-platform hailo15 --ubuntu-version 22.04

which resulted in the (expected) error

dpkg: error: cannot access archive 'hailort/hailort_*_amd64.deb': No such file or directory

because this package does not exist.

Is there an up-to-date guide on how to get Tappas with an SDK running in a docker container on an x86 Linux host for cross-compilation for the Hailo15 platform? Or am I just not seeing the relevant information?

The development setup I am trying to achieve is the following:

  • x86 host (Ubuntu 22.04) with IDE
    • Docker Container
      • Tappas and all relevant libraries
      • SDK for cross compilation, copied into docker container from Vision Processor Software Package
      • C++ ai_example_app for experimentation

Or is this setup incorrect?

Thank you again for your help!

Hi @funs,

You don’t need to install TAPPAS or any Docker container. Simply clone the repository to your host machine, modify the code, and then cross-compile it.

You’ll also need the SDK for cross-compilation, which is included in the Hailo Vision Processor Software Package.

Here are the steps:

  1. Download the “Vision Processor Software Package” from the developer zone
  2. Extract the software package, then extract and install the SDK:
cd hailo_vision_processor_sw_package_<VERSION>/prebuilt/sbc/sdk
./poky-glibc-x86_64-core-image-minimal-armv8a-hailo15-sbc-toolchain-<X.X.X>.sh
  3. Clone the hailo-camera-apps repository:
git clone https://github.com/hailo-ai/hailo-camera-apps.git
cd hailo-camera-apps
  4. Modify the code in the AI example app as needed.
  5. Cross-compile using the cross_compile_native_apps.py script located in hailo-camera-apps/tools/cross_compiler/ (a sketch for copying the result to the SBC manually follows this list):
python ./cross_compile_native_apps.py hailo15 release <PATH_TO_TOOLCHAIN> --remote-machine-ip 10.0.0.1
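If you don’t pass --remote-machine-ip, the built binary can also be copied to the SBC manually, for example:

# assumed output path; check the "Build dir" that cross_compile_native_apps.py prints
scp tools/cross_compiler/armv8a-native-apps-build-release/native/apps/ai_example_app/ai_example_app root@10.0.0.1:/home/root/
ssh root@10.0.0.1 ./ai_example_app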

Let me know if you run into any issues or have any questions!

Hi @lihis

Thanks for the clarification. I got everything to work and am now able to start developing, thank you very much!

One thing I noticed when running the ai_example_app out of the box, i.e. without modifications, is that it is very laggy and of poor quality with many lost frames. I am connected via Ethernet as per instructions.
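For reference, I view the stream on the host with a standard GStreamer receive pipeline along these lines (requires the GStreamer libav plugins on the host; the port and caps follow the app’s defaults and may differ in your setup):

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,media=(string)video,encoding-name=(string)H264 ! \
    rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! fpsdisplaysink sync=false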

I tried streaming only an FHD resolution (I replaced the 4K stream with an FHD stream), but the result is similar.

With another example application, e.g. the GStreamer detection example (hailo-camera-apps/apps/h15/gstreamer/detection at 1.6.1 · hailo-ai/hailo-camera-apps · GitHub), the resulting stream is much smoother with almost no lost frames. It also uses an FHD resolution for streaming.

Is this performance difference due to the architecture of the pipeline (tiling, two-stage inference and tracking)? Or is there another explanation?

Thanks!

Hi @funs

How did you change the stream resolution in the AI example?

Hello @lihis

The problem was the bitrate, which was set to 16000000. I guess this was too high for my setup; I think the USB-Ethernet adapter on my laptop was the issue. I reduced it to 6000000 and it works well.

Regarding your question, I changed the configuration in frontend_config.json and the corresponding lines in the code so that I was streaming an FHD stream instead of a 4K stream. But with the lower bitrate it also works fine with 4K.

Thank you again very much for your help

@lihis
Could you please verify the following issue? I’m using the Hailo-15H for face detection with the SCRFD_10g model and running a GStreamer pipeline. However, the monitoring feature is not working on the host machine. Below is the script I’m trying to run:

#!/bin/bash
set -e

CURRENT_DIR="$(dirname "$(realpath "${BASH_SOURCE[0]}")")"

function init_variables() {
    readonly RESOURCES_DIR="${CURRENT_DIR}/resources"
    readonly POSTPROCESS_DIR="/usr/lib/hailo-post-processes"
    readonly DEFAULT_POSTPROCESS_SO="$POSTPROCESS_DIR/libface_detection_post.so"
    readonly DEFAULT_HEF_PATH="${RESOURCES_DIR}/scrfd_10g.hef"
    readonly DEFAULT_JSON_CONFIG_PATH="$RESOURCES_DIR/configs/scrfd.json"
    readonly DEFAULT_FRONTEND_CONFIG_FILE_PATH="$RESOURCES_DIR/configs/frontend_config.json"
    readonly DEFAULT_ENCODER_CONFIG_PATH="$RESOURCES_DIR/configs/encoder_config.json"
    readonly DEFAULT_NETWORK_NAME="scrfd_10g"
    readonly DEFAULT_VIDEO_SOURCE="/dev/video0"
    readonly DEFAULT_UDP_PORT=5000 # Changed to avoid conflict with detection.sh
    readonly DEFAULT_UDP_HOST_IP="10.0.0.2"
    readonly DEFAULT_FRAMERATE="30/1"
    readonly DEFAULT_BITRATE=25000000

    encoder_config_path="$DEFAULT_ENCODER_CONFIG_PATH"
    postprocess_so="$DEFAULT_POSTPROCESS_SO"
    network_name="$DEFAULT_NETWORK_NAME"
    hef_path="$DEFAULT_HEF_PATH"
    json_config_path="$DEFAULT_JSON_CONFIG_PATH"
    frontend_config_file_path="$DEFAULT_FRONTEND_CONFIG_FILE_PATH"
    udp_host_ip="$DEFAULT_UDP_HOST_IP"
    udp_port="$DEFAULT_UDP_PORT"
    sync_pipeline=false
    framerate="$DEFAULT_FRAMERATE"
    max_buffers_size=5
    bitrate="$DEFAULT_BITRATE"
    encoding_hdr="hdr=false"
    print_gst_launch_only=false
    additional_parameters=""
    #debug_mode=false
    mode="daylight"
    tuning_extension=""
    project="hailo15h"
}

function print_usage() {
    echo "Hailo15 Face Detection pipeline usage:"
    echo ""
    echo "Options:"
    echo "  --help               Show this help"
    echo "  --show-fps           Print fps"
    echo "  --print-gst-launch   Print gst-launch command without running"
    echo "  --debug              Enable debug mode with verbose output"
    echo "  --port PORT          Set UDP port (default: 5001)"
    echo "  --mode               mode (e.g., daylight)"
    echo "  --tuning             tuning extension - relevant only for denoise (e.g., _r0225)"
    echo "  --project            project name (e.g., hailo15h)"
    exit 0
}

function parse_args() {
    while test $# -gt 0; do
        if [ "$1" = "--help" ] || [ "$1" == "-h" ]; then
            print_usage
            exit 0
        elif [ "$1" = "--print-gst-launch" ]; then
            print_gst_launch_only=true
        elif [ "$1" = "--show-fps" ]; then
            echo "Printing fps"
            additional_parameters="-v | grep hailo_display"
        elif [ "$1" = "--debug" ]; then
            debug_mode=true
            additional_parameters="-v"
        elif [ "$1" = "--port" ]; then
            shift
            udp_port="$1"
        else
            echo "Received invalid argument: $1. See expected arguments below:"
            print_usage
            exit 1
        fi
        shift
    done
}

function check_files() {
    echo "Checking required files..."
    local missing_files=()
    if [ ! -f "$hef_path" ]; then
        missing_files+=("HEF file: $hef_path")
    fi
    if [ ! -f "$json_config_path" ]; then
        missing_files+=("JSON config: $json_config_path")
    fi
    if [ ! -f "$postprocess_so" ]; then
        missing_files+=("Post-process library: $postprocess_so")
    fi
    if [ ! -f "$frontend_config_file_path" ]; then
        missing_files+=("Frontend config: $frontend_config_file_path")
    fi
    if [ ! -f "$encoder_config_path" ]; then
        missing_files+=("Encoder config: $encoder_config_path")
    fi
    if [ ${#missing_files[@]} -gt 0 ]; then
        echo "ERROR: Missing required files:"
        for file in "${missing_files[@]}"; do
            echo "  - $file"
        done
        exit 1
    fi
    echo "All required files found."
}

init_variables $@
parse_args $@

# Check if files exist
check_files

UDP_SINK="udpsink host=$udp_host_ip port=$udp_port"

PIPELINE="gst-launch-1.0 \
    hailofrontendbinsrc config-file-path=$frontend_config_file_path name=frontend \
    frontend. ! \
    queue leaky=no max-size-buffers=$max_buffers_size max-size-bytes=0 max-size-time=0 ! \
    hailonet hef-path=$hef_path scheduling-algorithm=1 vdevice-group-id=device0 ! \
    queue leaky=no max-size-buffers=$max_buffers_size max-size-bytes=0 max-size-time=0 ! \
    hailofilter function-name=$network_name config-path=$json_config_path so-path=$postprocess_so qos=false ! \
    queue leaky=no max-size-buffers=$max_buffers_size max-size-bytes=0 max-size-time=0 ! \
    hailooverlay qos=false ! \
    queue leaky=no max-size-buffers=$max_buffers_size max-size-bytes=0 max-size-time=0 ! \
    hailoencodebin config-file-path=$encoder_config_path ! h264parse config-interval=-1 ! \
    video/x-h264,framerate=$framerate ! \
    tee name=udp_tee \
    udp_tee. ! \
        queue leaky=no max-size-buffers=$max_buffers_size max-size-bytes=0 max-size-time=0 ! \
        rtph264pay ! 'application/x-rtp, media=(string)video, encoding-name=(string)H264' ! \
        $UDP_SINK name=udp_sink sync=$sync_pipeline \
    udp_tee. ! \
        queue leaky=no max-size-buffers=$max_buffers_size max-size-bytes=0 max-size-time=0 ! \
        fpsdisplaysink fps-update-interval=2000 video-sink=fakesink name=hailo_display sync=$sync_pipeline text-overlay=false \
    ${additional_parameters}"

echo "Running $network_name"
echo "UDP streaming to: $udp_host_ip:$udp_port"
echo ""

if [ "$debug_mode" = true ]; then
    echo "=== DEBUG INFO ==="
    echo "HEF Path: $hef_path"
    echo "JSON Config: $json_config_path"
    echo "Post-process SO: $postprocess_so"
    echo "Frontend Config: $frontend_config_file_path"
    echo "Encoder Config: $encoder_config_path"
    echo "=================="
    echo ""
fi

echo "Pipeline command:"
echo "${PIPELINE}"
echo ""

if [ "$print_gst_launch_only" = true ]; then
    exit 0
fi

eval ${PIPELINE}

Hello @lihis, I followed your instructions, but I got this:

python3 ./cross_compile_native_apps.py hailo15 debug /opt/poky/4.0.23/
INFO:/home/ruslan/projects/tassvision/hailo/hailo-camera-apps/tools/cross_compiler/common.py:Toolchain has been already unpacked and installed successfully. Skipping
INFO:/home/ruslan/projects/tassvision/hailo/hailo-camera-apps/tools/cross_compiler/common.py:Building native-apps
Build dir: /home/ruslan/projects/tassvision/hailo/hailo-camera-apps/tools/cross_compiler/armv8a-native-apps-build-debug/native
INFO:/home/ruslan/projects/tassvision/hailo/hailo-camera-apps/tools/cross_compiler/common.py:Running Meson build.
Traceback (most recent call last):
  File "/home/ruslan/projects/tassvision/hailo/hailo-camera-apps/tools/cross_compiler/./cross_compile_native_apps.py", line 90, in <module>
    gst_installer.build(args.limit_jobs)
  File "/home/ruslan/projects/tassvision/hailo/hailo-camera-apps/tools/cross_compiler/common.py", line 290, in build
    self.run_meson_build_command(env)
  File "/home/ruslan/projects/tassvision/hailo/hailo-camera-apps/tools/cross_compiler/common.py", line 212, in run_meson_build_command
    build_cmd = self.get_meson_build_command()
  File "/home/ruslan/projects/tassvision/hailo/hailo-camera-apps/tools/cross_compiler/./cross_compile_native_apps.py", line 53, in get_meson_build_command
    raise FileNotFoundError(f"One or more of the external packages are missing. Please run {TAPPAS_WORKSPACE}/scripts/build_scripts/clone_external_packages.sh")
FileNotFoundError: One or more of the external packages are missing. Please run /home/ruslan/projects/tassvision/hailo/hailo-camera-apps/scripts/build_scripts/clone_external_packages.sh

Could you help me solve this?

Hi

Run /home/ruslan/projects/tassvision/hailo/hailo-camera-apps/scripts/build_scripts/clone_external_packages.sh
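That is, roughly:

cd /home/ruslan/projects/tassvision/hailo/hailo-camera-apps
./scripts/build_scripts/clone_external_packages.sh    # clones the open_source dependencies (xtensor, cxxopts, rapidjson, ...)
cd tools/cross_compiler
python3 ./cross_compile_native_apps.py hailo15 debug /opt/poky/4.0.23/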


@lihis Thank you so much!

Hello @lihis, I hope you are doing well!
After a successful build, I replaced ai_example_app with the new one. However, I got this error; how can I fix it?
./ai_example_app: error while loading shared libraries: libhailo_reference_camera.so.1: cannot open shared object file: No such file or directory