Development workflow Hailo 15H

Hello everyone,

I have a few questions regarding the development workflow when working with a Hailo15 platform, such as the Hailo15 SBC. I am unsure how to start developing custom apps, specifically what the best way is to develop and test them.

My setup is as follows:

  • Hailo15H SBC with two LeopardImaging cameras
  • Development machine with Ubuntu 22.04

I want to develop both Python and C++ applications. For Python applications, I guess I can work directly on the SBC. C++ applications, however, need to be compiled.

I would like to use my Linux host as a development machine and cross-compile C++ applications for the Hailo15 SBC. I downloaded the Software Suite Docker File and launched the Docker container.

Now, however, I am stuck. For experimentation, I wanted to recompile the ai_example_app but failed. There is some documentation about cross-compilation here (tappas/tools/cross_compiler at master · hailo-ai/tappas · GitHub), but I am unsure whether this is the correct way to do it.

Could someone explain to me, in simple terms, the steps I need to take to develop my own Python and C++ applications for the Hailo15 SBC on a Linux development host?

Thank you very much, kind regards
Steve

The Hailo AI Software Suite Docker is used to convert models from TFLite and ONNX format into HEF for Hailo-8 and Hailo-15.

The SDK is part of the Yocto build. You will find a prebuilt SDK in the Hailo-15 Vision Processor SW package.
When you build your own Yocto image, you can build the SDK yourself. This is described in the Hailo OS user guide.
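For reference, the standard Yocto mechanism for producing such an SDK is the populate_sdk task; a sketch with a placeholder image name (the Hailo OS user guide names the actual target):

```shell
# Generic Yocto SDK generation (image name is a placeholder):
bitbake <your-image-recipe> -c populate_sdk
# The self-extracting toolchain installer (*.sh) then appears under
# tmp/deploy/sdk/ in the Yocto build directory.
```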

I would recommend using C++ as much as possible to get good performance. The Cortex-A53 processor in the Hailo-15 was designed for efficiency rather than maximum performance.

Thank you very much for your reply

I have downloaded the Hailo-15 Vision Processor SW package and located the SDK. I was also able to build a sample application, just a hello-world, on my host and run it on the Hailo-15 SBC. So far so good…
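In case it helps others, the flow I used looks roughly like this (the environment script name and install directory are placeholders that depend on the SDK version you installed):

```shell
# Source the cross-compile environment installed by the SDK .sh installer
# (exact file name varies with the toolchain version):
. <SDK_INSTALL_DIR>/environment-setup-armv8a-poky-linux

# Build a trivial test program with the cross compiler ($CXX is set
# by the environment script) and copy it to the SBC:
cat > hello.cpp <<'EOF'
#include <iostream>
int main() { std::cout << "hello from hailo15\n"; }
EOF
$CXX hello.cpp -o hello
scp hello root@<SBC_IP>: && ssh root@<SBC_IP> ./hello
```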

I would now like to start developing my vision application, beginning by recompiling and modifying the ai_example_app in the tappas repo. I am trying to follow the documentation and would like to use the docker container instead of a native installation.

In the tappas master branch (not master-vpu), I found the documentation on using a prebuilt docker container. However, when trying to download tappas in the developer zone, the following is stated:

For vision processors, the TAPPAS package is included in the Vision Processors Software Package, to be compiled into the application processor’s image.

However, the mentioned package contains no reference to Tappas. I then downloaded the TAPPAS – Docker for x86_64 files and launched the docker container as explained in the documentation. As described there, I then copied the toolchain script from the Vision Processor Software Package into the running docker container and tried to compile the apps using the cross_compile_tappas script:

./cross_compile_tappas.py aarch64 hailo15 debug /local/workspace/tappas/sdk/ --build-lib apps

I first had to edit the script to use python3 instead of python, as there seems to be no python binary or symlink in the container.
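A less invasive alternative to editing the script is to expose python3 under the name python, e.g. via a per-user shim directory (a sketch; on Ubuntu-based containers you could also install the python-is-python3 package):

```shell
# Make "python" resolve to python3 without touching system paths:
mkdir -p "$HOME/.local/pybin"
ln -sf "$(command -v python3)" "$HOME/.local/pybin/python"
export PATH="$HOME/.local/pybin:$PATH"
python --version   # now reports the Python 3 interpreter
```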

After editing the script, the toolchain seems to be installed correctly, but when Meson tries to set up the project, the following error occurs:

INFO:/local/workspace/tappas/tools/cross_compiler/common.py:Starting the installation of the toolchain
INFO:/local/workspace/tappas/tools/cross_compiler/common.py:installing /local/workspace/tappas/sdk/poky-glibc-x86_64-core-image-minimal-armv8a-hailo15-sbc-toolchain-4.0.23.sh
INFO:/local/workspace/tappas/tools/cross_compiler/common.py:Toolchain ready to use (/local/workspace/tappas/sdk)
INFO:/local/workspace/tappas/tools/cross_compiler/common.py:Building gsthailotools plugins and post processes
Build dir: /local/workspace/tappas/tools/cross_compiler/aarch64-gsthailotools-build-debug/hailo
INFO:/local/workspace/tappas/tools/cross_compiler/common.py:Running Meson build.
meson-wrapper: Implicit setup command assumed
meson-wrapper: Running meson with setup options: " --cross-file=/local/workspace/tappas/sdk/sysroots/x86_64-pokysdk-linux/usr/share/meson/aarch64-poky-linux-meson.cross --native-file=/local/workspace/tappas/sdk/sysroots/x86_64-pokysdk-linux/usr/share/meson/meson.native "
The Meson build system
Version: 0.61.3
Source dir: /local/workspace/tappas/core/hailo
Build dir: /local/workspace/tappas/tools/cross_compiler/aarch64-gsthailotools-build-debug/hailo
Build type: cross build

meson.build:3:0: ERROR: Unknown options: "include_unit_tests"

A full log can be found at /local/workspace/tappas/tools/cross_compiler/aarch64-gsthailotools-build-debug/hailo/meson-logs/meson-log.txt
Traceback (most recent call last):
  File "/local/workspace/tappas/tools/cross_compiler/./cross_compile_tappas.py", line 98, in <module>
    gst_installer.build()
  File "/local/workspace/tappas/tools/cross_compiler/common.py", line 285, in build
    self.run_meson_build_command(env)
  File "/local/workspace/tappas/tools/cross_compiler/common.py", line 208, in run_meson_build_command
    self._runner.run(build_cmd, env=env, print_output=True)
  File "/local/workspace/tappas/tools/cross_compiler/common.py", line 113, in run
    p.check_returncode()
  File "/usr/lib/python3.10/subprocess.py", line 457, in check_returncode
    raise CalledProcessError(self.returncode, self.args, self.stdout,
subprocess.CalledProcessError: Command '['meson', '/local/workspace/tappas/tools/cross_compiler/aarch64-gsthailotools-build-debug/hailo', '--buildtype', 'debug', '-Dlibargs=-I/local/workspace/tappas/sdk/sysroots/aarch64-poky-linux/usr/include/hailort,-I/local/workspace/tappas/sdk/sysroots/aarch64-poky-linux/usr/include/gst-hailo/metadata,-std=c++17', '-Dprefix=/local/workspace/tappas/sdk/sysroots/aarch64-poky-linux/usr', '-Dinclude_blas=false', '-Dtarget_platform=hailo15', '-Dtarget=apps', '-Dlibxtensor=/local/workspace/tappas/core/open_source/xtensor_stack/base', '-Dlibblas=/local/workspace/tappas/core/open_source/xtensor_stack/blas', '-Dlibcxxopts=/local/workspace/tappas/core/open_source/cxxopts', '-Dlibrapidjson=/local/workspace/tappas/core/open_source/rapidjson', '-Dinclude_unit_tests=false']' returned non-zero exit status 1.
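For context on the error itself: Meson only accepts a `-D<name>=<value>` flag if the project being configured declares that option, and project options live in the sources' `meson_options.txt`. An illustrative declaration (not the real tappas file) would look like:

```meson
# meson_options.txt (illustrative; the actual tappas file differs)
option('include_unit_tests',
       type : 'boolean',
       value : false,
       description : 'Build the unit tests')
```

An "Unknown options" error therefore usually means the build script and the checked-out sources come from mismatched versions.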

I am almost certain that I am doing something wrong but would be glad if someone could help me get the development environment ready to cross-compile the example apps and get started developing my own application.

The previous Meson error turned out to be only a configuration issue.

However, I still run into problems when cross-compiling the ai_example_app. There are some definition issues, e.g.:

...
../../../../core/hailo/apps/hailo15/ai_example_app/infra/ai_stage.hpp:127:114: error: 'CMA' was not declared in this scope
  127 |                                                                                              m_output_pool_size, CMA, tensor_size, tensor_name);
      |                                                                                                                  ^~~
../../../../core/hailo/apps/hailo15/ai_example_app/infra/ai_stage.hpp: In member function 'AppStatus HailortAsyncStage::set_pix_buf(HailoMediaLibraryBufferPtr)':
../../../../core/hailo/apps/hailo15/ai_example_app/infra/ai_stage.hpp:174:39: error: 'using element_type = struct hailo_media_library_buffer' {aka 'struct hailo_media_library_buffer'} has no member named 'get_plane'; did you mean 'get_plane_fd'?
  174 |         auto y_plane_buffer = buffer->get_plane(0);
      |                                       ^~~~~~~~~
      |                                       get_plane_fd
...

Might this be due to a mismatch between the tappas docker image and the toolchain? The docker image is, after all, intended for the accelerators.

As mentioned above, I cannot find the prebuilt tappas docker images for the vision processor anywhere. The manual installation guide says to download tappas from the Hailo developer zone (tappas_VERSION_linux_installer.zip), which I also cannot find when selecting ‘Vision Processor’ in the download section.

Some help regarding the setup of Tappas on a Linux host for the Hailo 15 Vision Processor would be much appreciated, so that I can start developing my own C++ vision applications.

Hi @funs

I recommend cloning this repo if you want to modify the code.
Make sure to switch to the branch corresponding to the image you burned.

This repo contains all the relevant documentation and instructions on how to compile.
If you run into any issues while compiling, let me know.

Hello @lihis

Thank you for your post. I checked out the repo you mentioned. Unfortunately, I still have troubles getting things up and running.

The documentation describing the installation still says to download the following:

For Docker

Download from Hailo developer zone tappas_VERSION_ARCH_docker.zip and unzip the file, it should contain the following files:

For the Vision Processor, this file does not exist, at least not where the corresponding file for the accelerator would be:

The Vision Processor Software Package also does not contain the Tappas package.

Manual Install

According to the manual installation instruction, I should install HailoRT and then Tappas:

But again, those files exist neither in the Software Download Center nor in the Vision Processor Software Package.

I nevertheless tried to build a docker image using the build script in this repo:

./build_docker.sh --target-platform hailo15 --ubuntu-version 22.04

which resulted in the (expected) error:

dpkg: error: cannot access archive ‘hailort/hailort_*_amd64.deb’: No such file or directory

because this package does not exist.

Is there an up-to-date guide on how to get Tappas with an SDK running in a docker container on an x86 Linux host for cross-compilation for the Hailo15 platform? Or am I just not seeing the relevant information?

The development setup I am trying to achieve is the following:

  • x86 host (Ubuntu 22.04) with IDE
    • Docker Container
      • Tappas and all relevant libraries
      • SDK for cross compilation, copied into docker container from Vision Processor Software Package
      • C++ ai_example_app for experimentation

Or is this setup incorrect?

Thank you again for your help!

Hi @funs,

You don’t need to install TAPPAS or any Docker container. Simply clone the repository to your host machine, modify the code, and then cross-compile it.

You’ll also need the SDK for cross-compilation, which is included in the Hailo Vision Processor Software Package.

Here are the steps:

  1. Download the “Vision Processor Software Package” from the developer zone
  2. Extract the software package, then extract and install the SDK:
cd hailo_vision_processor_sw_package_<VERSION>/prebuilt/sbc/sdk
./poky-glibc-x86_64-core-image-minimal-armv8a-hailo15-sbc-toolchain-<X.X.X>.sh
  3. Clone the hailo-camera-apps repository:
git clone https://github.com/hailo-ai/hailo-camera-apps.git
cd hailo-camera-apps
  4. Modify the code in the AI example app as needed.
  5. Cross-compile using the cross_compile_native_apps.py script located in hailo-camera-apps/tools/cross_compiler/:
python ./cross_compile_native_apps.py hailo15 release <PATH_TO_TOOLCHAIN> --remote-machine-ip 10.0.0.1
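If you prefer to copy the build results to the SBC by hand instead of relying on the script's remote-machine option, a generic sketch (the build output path is a placeholder):

```shell
# Hypothetical manual deployment of a cross-built app to the SBC:
scp <BUILD_DIR>/ai_example_app root@10.0.0.1:/home/root/
ssh root@10.0.0.1 '/home/root/ai_example_app'
```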

Let me know if you run into any issues or have any questions!

Hi @lihis

Thanks for the clarification. I got everything to work and am now able to start developing, thank you very much!

One thing I noticed when running the ai_example_app out of the box, i.e. without modifications, is that the stream is very laggy and of poor quality, with many lost frames. I am connected via Ethernet as per the instructions.

I tried streaming at FHD resolution only (I replaced the 4K stream with an FHD stream), but the result is similar.

With another example application, the GStreamer detection example (hailo-camera-apps/apps/h15/gstreamer/detection at 1.6.1 · hailo-ai/hailo-camera-apps · GitHub), the resulting stream is much smoother, with almost no lost frames. It also streams at FHD resolution.

Is this performance difference due to the architecture of the pipeline (tiling, two-stage inference and tracking)? Or is there another explanation?

Thanks!

Hi @funs

How did you change the stream resolution in the AI example?

Hello @lihis

The problem was the bitrate, which was set to 16000000. I guess this was too high for my setup; I think the USB-Ethernet adapter on my laptop was the bottleneck. I reduced it to 6000000 and it works well.

Regarding your question: I changed the configuration in frontend_config.json and the corresponding lines in the code so that I was streaming an FHD stream instead of a 4K stream. But with the lower bitrate, it also works fine with 4K.
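For anyone hitting the same lag: the fix was only the bitrate value in the encoder configuration. Schematically, the change looked like this (the key names here are purely illustrative; check the structure of the config files in your release):

```json
{
  "encoding": {
    "bitrate": 6000000
  }
}
```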

Thank you again very much for your help