Need help with a voluntary project: converting ONNX to HEF

Hello everyone. I work at a multinational company and we have some voluntary projects. I joined one, but we have very limited time, and you know how it goes: big company, big secrets, so I have to work with chains on my hands…

Project: Smart Trash
Model: Google Teachable Machine (it came from management)
I have learned a lot, but it still needs improvement.
Format: tflite
The Raspberry Pi 5 is the workplace. The program works there, but it isn't very effective, so we thought we would add the AI HAT+. The code is prepared and so is the Raspberry, but we cannot convert the model from “.onnx” into “.hef”… I have already tried what I could, but I had only one day left and I didn't get to try the HAT, so maybe someone who already has the environment for the conversion could help me, or tell me what my problem with converting is. (I get lost in the Parse phase… the YAML file doesn't work and I can't figure out what is happening.)

YAML:

network:
  network_name: model_v1

paths:
  network_path:
  - /workspace/models/model_v1.onnx

parser:
  nodes:
  - sequential_1_input
  - Identity
  start_node_shapes:
    sequential_1_input: [1, 3, 224, 224]

preprocessing:
  network_type: classification

evaluation:
  dataset_name: dummy
  classes: 5
  labels_offset: 0

It produces this error:

(hailo_virtualenv) hailo@hailo-vm:/workspace/models$ hailomz parse --yaml /workspace/models/model_v1.yaml
Start run for network model_v1 …
Initializing the runner…
Traceback (most recent call last):
  File "/local/workspace/hailo_virtualenv/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 201, in parse
    parse_model(runner, network_info, ckpt_path=args.ckpt_path, results_dir=args.results_dir, logger=logger)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 126, in parse_model
    raise Exception(f"Encountered error during parsing: {err}") from None
Exception: Encountered error during parsing: Expecting value: line 1 column 1 (char 0)
(hailo_virtualenv) hailo@hailo-vm:/workspace/models$

So the model seems fine and the file exists, but I cannot parse it.

Thank you Everyone!

Best Regards

Hey @Zsolt_Csigi,

Welcome to the Hailo Community!

For custom ONNX models, I’d recommend skipping the Model-Zoo CLI and going straight to the Hailo Dataflow Compiler CLI. Here’s the workflow I’ve been using:

Parse → Optimize → Compile

Step 1: Parse ONNX to HAR

hailo parser onnx /workspace/models/model_v1.onnx \
    --start-node-names sequential_1_input \
    --end-node-names Identity \
    --net-input-shapes sequential_1_input=1,3,224,224 \
    --hw-arch hailo8l

A few things to watch out for:

  • Make sure your --start-node-names and --end-node-names match what’s in your YAML
  • The --net-input-shapes needs to match your actual input dimensions
  • Since you’re using the AI-Hat, --hw-arch hailo8l is correct

This will create model_v1.har in your current directory.
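
If the Model Zoo YAML keeps failing, the same parse step can also be run from Python through the DFC's ClientRunner, which in my experience prints more readable parser errors. This is just a minimal sketch, assuming the hailo_sdk_client package that ships with the Dataflow Compiler and reusing the node names and input shape from your YAML:

# Parse the ONNX into a HAR via the DFC Python API (sketch, not tested on your model)
from hailo_sdk_client import ClientRunner

runner = ClientRunner(hw_arch="hailo8l")
hn, npz = runner.translate_onnx_model(
    "/workspace/models/model_v1.onnx",
    "model_v1",
    start_node_names=["sequential_1_input"],
    end_node_names=["Identity"],
    net_input_shapes={"sequential_1_input": [1, 3, 224, 224]},
)
runner.save_har("/workspace/models/model_v1.har")

If translate_onnx_model complains about the node names, open the ONNX in Netron and copy the exact input/output names from there.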

Step 2: Optimize (Quantization)

hailo optimize model_v1.har \
    --calib-path /workspace/calibration_images/ \
    --performance

The optimization step quantizes the model from FP32 to INT. You'll need at least ~1K calibration images for this to work well. The --performance flag gives you the best throughput settings.

Output: model_v1_quantized.har
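
If you don't have calibration data prepared yet, here is a rough sketch of how you could pack representative trash images into a single calibration set. The folder path comes from the command above; the 224x224 RGB / NHWC layout is my assumption based on your YAML, so double-check what your DFC version expects (a folder of images vs. a .npy file):

# Build a calibration .npy from a folder of sample images (sketch)
import glob
import numpy as np
from PIL import Image

image_paths = sorted(glob.glob("/workspace/calibration_images/*.jpg"))[:1024]

samples = []
for path in image_paths:
    img = Image.open(path).convert("RGB").resize((224, 224))
    samples.append(np.asarray(img, dtype=np.uint8))

calib = np.stack(samples)  # shape (N, 224, 224, 3)
np.save("/workspace/models/calib_set.npy", calib)
print("Saved", calib.shape[0], "calibration samples")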

Step 3: Compile to HEF

hailo compiler model_v1_quantized.har \
    --hw-arch hailo8l

This is where the magic happens: it generates your final .hef file, ready for the Pi 5 + AI Hat.

If you need any preprocessing on-chip, you can add flags like --input-conversion nv12_to_rgb or --resize H W.
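
If you'd rather drive this step from Python as well, the DFC tutorials do roughly the following; again just a sketch, assuming hailo_sdk_client and the quantized HAR from step 2:

# Compile the quantized HAR into a HEF via the DFC Python API (sketch)
from hailo_sdk_client import ClientRunner

runner = ClientRunner(har="/workspace/models/model_v1_quantized.har")
hef_bytes = runner.compile()  # returns the serialized HEF

with open("/workspace/models/model_v1.hef", "wb") as f:
    f.write(hef_bytes)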

The whole flow looks like:
ONNX → HAR → Quantized HAR → HEF
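
Once the .hef is copied over to the Pi, a quick sanity check before plugging it into your application is to load it and print the input/output stream shapes. A minimal sketch, assuming the HailoRT Python bindings (hailo_platform) are installed on the Pi:

# Load the compiled HEF on the Pi and list its I/O vstreams (sketch)
from hailo_platform import HEF

hef = HEF("model_v1.hef")

for info in hef.get_input_vstream_infos():
    print("input :", info.name, info.shape)
for info in hef.get_output_vstream_infos():
    print("output:", info.name, info.shape)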

Hope this helps! Let me know if you run into any issues with any of these steps.


Hi @Zsolt_Csigi, if you still need help with this, feel free to DM me the .onnx or .tflite model file and I can try to get it compiled into .hef for you.


Thank you. I will try it soon!