I am trying to run the Python streaming example from the GitHub code examples. I followed the tutorial up to the last step, running yolox_stream_inference.py, and got this error:
./yolox_stream_inference.py
Traceback (most recent call last):
  File "/home/aaeon/Hailo-Application-Code-Examples/runtime/python/streaming/./yolox_stream_inference.py", line 8, in <module>
    from hailo_platform import (HEF, Device, VDevice, HailoStreamInterface, InferVStreams, ConfigureParams,
  File "/home/aaeon/.local/lib/python3.10/site-packages/hailo_platform/__init__.py", line 17, in <module>
    import hailo_platform.pyhailort._pyhailort as _pyhailort
ImportError: libhailort.so.4.20.0: cannot open shared object file: No such file or directory
I went to .local/lib/python3.10/site-packages/hailo_platform/pyhailort and these files are there: __pycache__, control_object.py, ethernet_utils.py, hailo_control_protocol.py, hw_object.py, i2c_slaves.py, __init__.py, power_measurement.py, _pyhailort.cpython-310-x86_64-linux-gnu.so, pyhailort.py.
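As far as I understand, _pyhailort.cpython-310-x86_64-linux-gnu.so is only the Python binding, and the error is about the system library it links against. Here is a minimal sketch of how that could be checked from Python (assuming a Linux system; the SONAME is taken from the error message above):

import ctypes
import ctypes.util

# SONAME taken from the ImportError above
expected = "libhailort.so.4.20.0"
try:
    ctypes.CDLL(expected)
    print(expected, "was found and loaded")
except OSError as err:
    print("could not load", expected, "-", err)

# Search the standard dynamic loader paths for any libhailort
print("find_library('hailort') ->", ctypes.util.find_library("hailort"))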
I followed this guidance (https://github.com/hailo-ai/Hailo-Application-Code-Examples), which does not say that hailortcli and TAPPAS are required; I thought TAPPAS was just a set of examples. I will install hailortcli and TAPPAS and try again. I also drew a figure of how I think the Hailo workflow fits together, but I am not sure it is correct. Could you check it for me? Thank you!
Hi @hung.dd
The suggested hailortcli --version is just a way to quickly check the HailoRT library version installed in your system.
While TAPPAS is not mandatory, the hailortcli tool should be available, as it is installed together with the HailoRT library.
Regarding your scheme: it is not 100% correct. It is more like this:
Python API (bindings) → HailoRT library → HailoRT PCIe driver.
The error you are seeing may be due to a missing HailoRT library, or a misalignment between the PyHailoRT and HailoRT versions. How did you perform the installation?
If you used the .whl file, please be aware that it just installs the Python bindings, and not the HailoRT library.
Depending on your system, the HailoRT library can be installed in different ways (.deb file, sources, …). Please refer to the HailoRT User Guide for the detailed procedure.
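If it helps, here is a rough sketch (not an official tool, just an illustration; it assumes a Linux install, a standard library path such as /usr/lib, and that the PCIe driver exposes a /dev/hailo* device node) of how each layer of the stack above can be checked from Python:

import ctypes
import glob

# 1. Python bindings: does the package import at all?
try:
    import hailo_platform
    print("hailo_platform bindings: importable")
except ImportError as err:
    print("hailo_platform bindings:", err)

# 2. HailoRT library: can the dynamic loader find it, and which files are on disk?
for name in ("libhailort.so.4.20.0", "libhailort.so"):
    try:
        ctypes.CDLL(name)
        print(name, "loadable")
    except OSError as err:
        print(name, err)
print("libhailort files:", glob.glob("/usr/lib/libhailort.so*") + glob.glob("/usr/lib/*/libhailort.so*"))

# 3. PCIe driver: is a Hailo device node present? (assumed /dev/hailo* naming)
print("device nodes:", glob.glob("/dev/hailo*"))

If the bindings import but the library cannot be loaded, that matches the case described above where only the .whl was installed.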
pi@raspberrypi:~/tappas/apps/h8/gstreamer/raspberrypi/detection $ ./detection.sh
WARN: HailoRT version is 4.19, expected to be 4.20 (version was extraced using the following command: 'hailortcli fw-control identify')
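(For reference, the two versions being compared can be printed side by side with a small sketch like this, using only the commands already mentioned in this thread; it just echoes their raw output without parsing it:)

import subprocess

for cmd in (["hailortcli", "--version"], ["hailortcli", "fw-control", "identify"]):
    print("$", " ".join(cmd))
    subprocess.run(cmd, check=False)
    print()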
Hello,
I did everything from the start: install HailoRT (.deb) → Hailo PCIe driver (.deb) → create a venv → install the HailoRT Python package (.whl) → Model Zoo → dependencies like numpy (1.x) → OpenCV → run the example code yolox_stream_inference.py, which failed (TensorFlow needed) → install TensorFlow → run the example code again and get this error.
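In case it helps with debugging, here is a minimal sanity check that could be run inside the venv after those steps (just a sketch; the module names are the ones listed above, and hailo_platform may not expose a __version__ attribute):

import importlib

# Check that each package from the steps above imports, and print its version if it has one
for name in ("numpy", "cv2", "tensorflow", "hailo_platform"):
    try:
        module = importlib.import_module(name)
        print(name, getattr(module, "__version__", "(no __version__ attribute)"))
    except ImportError as err:
        print(name, "->", err)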