Conversion from .task to .hef

Hi,

I’m working on a Raspberry Pi 5 with the Hailo-8L accelerator (13 TOPS) and I need to convert a MediaPipe model from .task to .hef, specifically:

pose_landmarker_lite.task → pose_landmarker_lite.hef

I want to run real-time pose estimation efficiently using HailoRT instead of CPU inference.

My Questions:

  1. What is the correct process to convert a .task model to a .hef file?
  • Do I need to extract the .tflite model from the .task file first?
  • What steps are required for model quantization and compilation for Hailo?
  2. Does this conversion improve inference speed and efficiency on the Hailo-8L?
  • I’m aiming to offload inference to the accelerator for real-time processing.

Any insights or resources on MediaPipe model conversion for Hailo would be greatly appreciated! 🚀

Thanks!

The pose_landmarker_lite.task model contains two TFLite models: pose_landmarks_detector.tflite and pose_detector.tflite (I’ve checked that their combined size equals that of the pose_landmarker_lite.task file). However, converting them to ONNX fails with the error reported below.

My question is: should these two models be converted to .hef individually, or should they be merged into a single file and converted together? If the latter, how? I attempted the conversion to .hef, but I couldn’t figure out how to do it.
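
For reference, here is a minimal reconstruction of my conversion.py (the function name matches the traceback below; the exact tf2onnx arguments are from memory, so treat them as approximate):

import zipfile
import tf2onnx

# The .task bundle is a ZIP archive; extract the two embedded TFLite models.
with zipfile.ZipFile("pose_landmarker_lite.task") as task_bundle:
    print(task_bundle.namelist())  # lists pose_detector.tflite and pose_landmarks_detector.tflite
    task_bundle.extractall(".")

def convert_tflite_to_onnx(tflite_path, onnx_path):
    # tf2onnx reads the .tflite flatbuffer directly and writes an ONNX model.
    onnx_model, _ = tf2onnx.convert.from_tflite(tflite_path, output_path=onnx_path)

convert_tflite_to_onnx("pose_detector.tflite", "pose_detector.onnx")  # raises the ValueError below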

Error:

Error in cpuinfo: prctl(PR_SVE_GET_VL) failed
INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
Traceback (most recent call last):
  File "/home/btsrasp/Mediapipe/conversion.py", line 23, in <module>
    convert_tflite_to_onnx(pose_detector_tflite, pose_detector_onnx)
  File "/home/btsrasp/Mediapipe/conversion.py", line 15, in convert_tflite_to_onnx
    onnx_model, _ = tf2onnx.convert.from_tflite(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/btsrasp/Hailo-Application-Code-Examples/runtime/python/pose_estimation/venv/lib/python3.11/site-packages/tf2onnx/convert.py", line 689, in from_tflite
    model_proto, external_tensor_storage = _convert_common(
                                           ^^^^^^^^^^^^^^^^
  File "/home/btsrasp/Hailo-Application-Code-Examples/runtime/python/pose_estimation/venv/lib/python3.11/site-packages/tf2onnx/convert.py", line 168, in _convert_common
    g = process_tf_graph(tf_graph, const_node_values=const_node_values,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/btsrasp/Hailo-Application-Code-Examples/runtime/python/pose_estimation/venv/lib/python3.11/site-packages/tf2onnx/tfonnx.py", line 453, in process_tf_graph
    main_g, subgraphs = graphs_from_tflite(tflite_path, input_names, output_names)
                        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/btsrasp/Hailo-Application-Code-Examples/runtime/python/pose_estimation/venv/lib/python3.11/site-packages/tf2onnx/tflite_utils.py", line 153, in graphs_from_tflite
    parse_tflite_graph(tfl_graph, opcodes, model, prefix, tensor_shapes_from_interpreter)
  File "/home/btsrasp/Hailo-Application-Code-Examples/runtime/python/pose_estimation/venv/lib/python3.11/site-packages/tf2onnx/tflite_utils.py", line 354, in parse_tflite_graph
    np_data = tensor_util.MakeNdarray(t)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/btsrasp/Hailo-Application-Code-Examples/runtime/python/pose_estimation/venv/lib/python3.11/site-packages/tensorflow/python/framework/tensor_util.py", line 674, in MakeNdarray
    dtype=dtype).copy().reshape(shape))
                        ^^^^^^^^^^^^^^
ValueError: cannot reshape array of size 96 into shape (16,1,1,24)

Hello @Simone_Tortorella,

Regarding your first question about converting MediaPipe models to Hailo format:

  1. Extract TFLite Models: You’ll need to unzip the .task file to access the component models (pose_landmarks_detector.tflite and pose_detector.tflite).

  2. Convert to Hailo Archive: Use the Hailo SDK to translate the TFLite models into .har format using the DFC parse command.

  3. Optimize Through Quantization: Calibrate the model with a dataset (1024+ images recommended) to prepare it for hardware acceleration using the DFC optimize command.

  4. Compile to Hailo Executable: Transform the optimized model into a .hef file that can run on Hailo hardware using the compile command. An end-to-end sketch of steps 1-4 follows below.
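
Assuming you use the DFC’s Python API (hailo_sdk_client), the full flow for one of the models looks roughly like this (exact signatures can differ between DFC versions, so treat it as a sketch rather than copy-paste code):

import numpy as np
from hailo_sdk_client import ClientRunner

# 1. Parse: translate the extracted TFLite model into a Hailo archive (.har).
runner = ClientRunner(hw_arch="hailo8l")  # compile for the Hailo-8L
runner.translate_tf_model("pose_detector.tflite", "pose_detector")
runner.save_har("pose_detector.har")

# 2. Optimize: quantize with a calibration set of preprocessed input frames.
#    Shape and dtype must match the model input; "calib_set.npy" is a
#    placeholder for your own data, e.g. (1024, 224, 224, 3).
calib_data = np.load("calib_set.npy")
runner.optimize(calib_data)

# 3. Compile: produce the .hef that HailoRT loads on the device.
hef_bytes = runner.compile()
with open("pose_detector.hef", "wb") as f:
    f.write(hef_bytes)

Repeat the same flow for pose_landmarks_detector.tflite. Keep in mind that the Dataflow Compiler runs on an x86 development machine, not on the Raspberry Pi; only the resulting .hef files are deployed to the Pi.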

You have flexibility in deployment: you can either combine both models into a single HEF file, or compile each one separately and run them using the HailoRT scheduler, depending on your needs.
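
If you compile them separately, the HailoRT model scheduler can time-share the Hailo-8L between the two networks. A minimal sketch using the HailoRT Python API (names follow recent pyhailort releases; check them against your installed HailoRT version):

from hailo_platform import VDevice, HailoSchedulingAlgorithm

# One virtual device shared by both models; the scheduler switches
# the accelerator between networks automatically.
params = VDevice.create_params()
params.scheduling_algorithm = HailoSchedulingAlgorithm.ROUND_ROBIN

with VDevice(params) as vdevice:
    detector = vdevice.create_infer_model("pose_detector.hef")
    landmarks = vdevice.create_infer_model("pose_landmarks_detector.hef")
    # configure each model, then run the detector first and feed its
    # cropped output into the landmarks model for each frame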

For deployment examples, please check out our Hailo-Application-Code-Examples repository.

Hope this helps!