Absurd Package Requirements

Hello, I have reached the point where I run:
hailomz compile yolov8s --ckpt=cybest.onnx --hw-arch hailo8l --calib-path train/images --classes 2 --performance

The problem, however, lies in the package requirements.

If NumPy 1.23 is installed, Numba says it is incompatible and I get an import error: ImportError: Numba needs NumPy 1.24 or greater. Got NumPy 1.23.

I tried installing NumPy 1.24, but then tensorflow and hailo-dataflow-compiler become incompatible, since they require NumPy 1.23.
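
For reference, the conflicting pins can be inspected with plain pip commands inside the virtualenv (nothing Hailo-specific here, just standard pip):

pip check        # reports which installed packages have conflicting requirements
pip show numpy numba tensorflow hailo-dataflow-compiler | grep -E 'Name|Version|Requires'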

How can I fix this? I am running Hailo DFC version 3.30, and I followed the exact steps from this guide: Raspberry Pi AI Kit: ONNX to HEF Conversion

Here are the outputs:

1- When I set numpy==1.24:

(hailodfc) hasan@MohammedPC:~/hailo_model_zoo$ hailomz compile yolov8s --ckpt=YV5s.onnx --hw-arch hailo8l --calib-path train/images --classes 1 --performance

A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.1.3 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.

If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/home/hasan/hailodfc/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 101, in run
    from hailo_model_zoo.main_driver import compile, evaluate, optimize, parse, profile
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 10, in <module>
    from hailo_sdk_client import ClientRunner, InferenceContext
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/hailo_sdk_client/__init__.py", line 25, in <module>
    import hailo_model_optimization  # noqa: F401
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/__init__.py", line 18, in <module>
    import tensorflow as tf  # noqa: E402
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/__init__.py", line 37, in <module>
    from tensorflow.python.tools import module_util as _module_util
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/__init__.py", line 37, in <module>
    from tensorflow.python.eager import context
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/eager/context.py", line 33, in <module>
    from tensorflow.python.client import pywrap_tf_session
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/client/pywrap_tf_session.py", line 19, in <module>
    from tensorflow.python.client._pywrap_tf_session import *
AttributeError: _ARRAY_API not found

A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.1.3 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.

If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/home/hasan/hailodfc/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 101, in run
    from hailo_model_zoo.main_driver import compile, evaluate, optimize, parse, profile
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 10, in <module>
    from hailo_sdk_client import ClientRunner, InferenceContext
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/hailo_sdk_client/__init__.py", line 25, in <module>
    import hailo_model_optimization  # noqa: F401
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/__init__.py", line 18, in <module>
    import tensorflow as tf  # noqa: E402
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/__init__.py", line 37, in <module>
    from tensorflow.python.tools import module_util as _module_util
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/__init__.py", line 42, in <module>
    from tensorflow.python import data
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/__init__.py", line 21, in <module>
    from tensorflow.python.data import experimental
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/__init__.py", line 97, in <module>
    from tensorflow.python.data.experimental import service
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/service/__init__.py", line 419, in <module>
    from tensorflow.python.data.experimental.ops.data_service_ops import distribute
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/ops/data_service_ops.py", line 22, in <module>
    from tensorflow.python.data.experimental.ops import compression_ops
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/ops/compression_ops.py", line 16, in <module>
    from tensorflow.python.data.util import structure
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/util/structure.py", line 22, in <module>
    from tensorflow.python.data.util import nest
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/util/nest.py", line 34, in <module>
    from tensorflow.python.framework import sparse_tensor as _sparse_tensor
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/framework/sparse_tensor.py", line 25, in <module>
    from tensorflow.python.framework import constant_op
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/framework/constant_op.py", line 25, in <module>
    from tensorflow.python.eager import execute
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/eager/execute.py", line 21, in <module>
    from tensorflow.python.framework import dtypes
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/framework/dtypes.py", line 29, in <module>
    from tensorflow.python.lib.core import _pywrap_bfloat16
AttributeError: _ARRAY_API not found
ImportError: numpy.core._multiarray_umath failed to import
ImportError: numpy.core.umath failed to import

A module that was compiled using NumPy 1.x cannot be run in
NumPy 2.1.3 as it may crash. To support both 1.x and 2.x
versions of NumPy, modules must be compiled with NumPy 2.0.
Some module may need to rebuild instead e.g. with 'pybind11>=2.12'.

If you are a user of the module, the easiest solution will be to
downgrade to 'numpy<2' or try to upgrade the affected module.
We expect that some modules will need time to support NumPy 2.

Traceback (most recent call last):
  File "/home/hasan/hailodfc/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 101, in run
    from hailo_model_zoo.main_driver import compile, evaluate, optimize, parse, profile
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 10, in <module>
    from hailo_sdk_client import ClientRunner, InferenceContext
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/hailo_sdk_client/__init__.py", line 25, in <module>
    import hailo_model_optimization  # noqa: F401
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/__init__.py", line 18, in <module>
    import tensorflow as tf  # noqa: E402
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/__init__.py", line 37, in <module>
    from tensorflow.python.tools import module_util as _module_util
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/__init__.py", line 42, in <module>
    from tensorflow.python import data
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/__init__.py", line 21, in <module>
    from tensorflow.python.data import experimental
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/__init__.py", line 97, in <module>
    from tensorflow.python.data.experimental import service
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/service/__init__.py", line 419, in <module>
    from tensorflow.python.data.experimental.ops.data_service_ops import distribute
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/ops/data_service_ops.py", line 22, in <module>
    from tensorflow.python.data.experimental.ops import compression_ops
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/ops/compression_ops.py", line 16, in <module>
    from tensorflow.python.data.util import structure
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/util/structure.py", line 22, in <module>
    from tensorflow.python.data.util import nest
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/util/nest.py", line 34, in <module>
    from tensorflow.python.framework import sparse_tensor as _sparse_tensor
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/framework/sparse_tensor.py", line 25, in <module>
    from tensorflow.python.framework import constant_op
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/framework/constant_op.py", line 25, in <module>
    from tensorflow.python.eager import execute
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/eager/execute.py", line 21, in <module>
    from tensorflow.python.framework import dtypes
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/framework/dtypes.py", line 31, in <module>
    from tensorflow.python.lib.core import _pywrap_float8
AttributeError: _ARRAY_API not found
ImportError: numpy.core._multiarray_umath failed to import
ImportError: numpy.core.umath failed to import

[the same NumPy warning and traceback are printed a second time]
Traceback (most recent call last):
  File "/home/hasan/hailodfc/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 101, in run
    from hailo_model_zoo.main_driver import compile, evaluate, optimize, parse, profile
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 10, in <module>
    from hailo_sdk_client import ClientRunner, InferenceContext
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/hailo_sdk_client/__init__.py", line 25, in <module>
    import hailo_model_optimization  # noqa: F401
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/__init__.py", line 18, in <module>
    import tensorflow as tf  # noqa: E402
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/__init__.py", line 37, in <module>
    from tensorflow.python.tools import module_util as _module_util
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/__init__.py", line 42, in <module>
    from tensorflow.python import data
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/__init__.py", line 21, in <module>
    from tensorflow.python.data import experimental
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/__init__.py", line 97, in <module>
    from tensorflow.python.data.experimental import service
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/service/__init__.py", line 419, in <module>
    from tensorflow.python.data.experimental.ops.data_service_ops import distribute
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/ops/data_service_ops.py", line 22, in <module>
    from tensorflow.python.data.experimental.ops import compression_ops
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/experimental/ops/compression_ops.py", line 16, in <module>
    from tensorflow.python.data.util import structure
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/util/structure.py", line 22, in <module>
    from tensorflow.python.data.util import nest
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/data/util/nest.py", line 34, in <module>
    from tensorflow.python.framework import sparse_tensor as _sparse_tensor
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/framework/sparse_tensor.py", line 25, in <module>
    from tensorflow.python.framework import constant_op
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/framework/constant_op.py", line 25, in <module>
    from tensorflow.python.eager import execute
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/eager/execute.py", line 21, in <module>
    from tensorflow.python.framework import dtypes
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/tensorflow/python/framework/dtypes.py", line 37, in <module>
    _np_bfloat16 = _pywrap_bfloat16.TF_bfloat16_type()
TypeError: Unable to convert function return value to a Python type! The signature was
        () -> handle

2- Then, when ChatGPT suggested lowering the NumPy version to 1.23, I got this error:

(hailodfc) hasan@MohammedPC:~/hailo_model_zoo$ hailomz compile yolov5s --ckpt=YV5s.onnx --hw-arch hailo8l --calib-path train/images --classes 1 --performance
Traceback (most recent call last):
  File "/home/hasan/hailodfc/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main.py", line 101, in run
    from hailo_model_zoo.main_driver import compile, evaluate, optimize, parse, profile
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 16, in <module>
    from hailo_model_zoo.core.main_utils import (
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 11, in <module>
    from hailo_model_zoo.core.eval import eval_factory
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/core/eval/eval_factory.py", line 9, in <module>
    discovered_plugins = {name: importlib.import_module(name) for _, name, _ in iter_namespace(hailo_model_zoo.core.eval)}
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/core/eval/eval_factory.py", line 9, in <dictcomp>
    discovered_plugins = {name: importlib.import_module(name) for _, name, _ in iter_namespace(hailo_model_zoo.core.eval)}
  File "/usr/lib/python3.10/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/core/eval/face_detection_evaluation.py", line 6, in <module>
    from hailo_model_zoo.core.eval.widerface_evaluation_external.evaluation import (
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/core/eval/widerface_evaluation_external/evaluation.py", line 3, in <module>
    from .python_box_overlaps import bbox_overlaps
  File "/home/hasan/hailo_model_zoo/hailo_model_zoo/core/eval/widerface_evaluation_external/python_box_overlaps.py", line 2, in <module>
    from numba import njit
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/numba/__init__.py", line 59, in <module>
    _ensure_critical_deps()
  File "/home/hasan/hailodfc/lib/python3.10/site-packages/numba/__init__.py", line 40, in _ensure_critical_deps
    raise ImportError(msg)
ImportError: Numba needs NumPy 1.24 or greater. Got NumPy 1.23.
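
One thing I have not verified yet (so treat it as a guess) is whether keeping NumPy at 1.23 and downgrading Numba to an older release that still accepts it would get past this import, something like:

pip install "numpy==1.23.*" "numba<0.61"   # unverified: version bounds are a guess

but I do not know which Numba versions the DFC and the model zoo actually support.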

I would really appreciate any help with this, as it is urgent for my engineering graduation project. I just need YOLOv5 drone models in .hef format (yolov5n, yolov5s, yolov5m, and yolov5l).

I coded a program that acts as an early warning system for drone detection using CCTV cameras, based on the Hailo-8L. I would love it if someone could provide me with a .hef model for drones.

Thanks!


Hi @kryptonicrevolution,
Thanks for sharing your experience.

I would like to suggest a couple of things:

  1. Use our pre-built SW Suite Docker image; all the SW packages are already installed, so you would not need to worry about them.
  2. If you do not want to go with the Docker option, please try a clean virtual environment dedicated to the DFC (see the rough outline below). Once you have a HEF, you can use it in your usual development environment.
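
As a rough outline of the clean-venv route (wheel and version names below are placeholders; use the exact files you download from the Developer Zone):

python3.10 -m venv hailodfc
source hailodfc/bin/activate
pip install hailo_dataflow_compiler-<version>-py3-none-linux_x86_64.whl   # the DFC wheel you downloaded
git clone https://github.com/hailo-ai/hailo_model_zoo.git
pip install -e hailo_model_zoo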

Thanks so much for the reply. I am quite new to this and do not have experience with Docker; is there a guide for doing it with Docker? I also have an Ubuntu 22.04 LTS VM in which I am trying to convert my ONNX file into a HEF.

Yes, please refer to this page:
2025-01 | Hailo
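
In short, the flow described there is: download the SW Suite Docker zip from the Developer Zone, unzip it, and run the launcher script it contains (file and script names vary per release, so follow the page itself as the source of truth):

unzip hailo_ai_sw_suite_<version>_docker.zip
./hailo_ai_sw_suite_docker_run.sh   # launcher script name may differ between releases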

Thank you so much for the help. Also, I wanted to ask: is there a pre-converted HEF model for drones? It would save me a long process, since my hardware isn’t really that great.

I went to the page, and it says “You do not have permission to view this page”.

All the models in our model zoo are also shared pre-compiled. You can decide what makes sense for you to use in your drone system.

Make sure that you are logged in.