I’m working on deploying a custom OCR model based on LPRNet to the Hailo-8 chip using the Hailo Model Zoo v2.15.0. I’ve trained and exported the model as ONNX and am now trying to generate the .hef file using hailomz compile --yaml <path_to_yaml>.
My YAML looks like this:
```yaml
network:
  network_name: testmodel
paths:
  network_path:
    - <model, e.g. test.onnx (in the same directory as the YAML and where the command is run)>
  alls_script: null
parser:
  nodes: ["input", "output"]
  start_node_shapes:
    input: [1, 1, 32, 160]
preprocessing:
  network_type: classification
evaluation:
  dataset_name: dummy
  classes: 37
  labels_offset: 0
targets:
  - device: hailo8
```
At the moment I get an error telling me that paths.url is missing. This parameter is not described in the documentation, and when I add url: "", url:, or url: null, it complains that the URL should start with http://.
Questions
What is the correct YAML configuration to compile a local ONNX model (custom LPRNet-based OCR)?
Do I need to include url, alls_script, or anything else to avoid download attempts?
What’s the simplest working config for compiling an ONNX model to HEF using only local files?
The error you’re seeing (paths.url missing) happens because your YAML is being treated as a remote model configuration. For local model compilation, you shouldn’t rely on URL-based download logic at all.
Here’s how to fix it:
Minimal YAML for Local ONNX Model Compilation
```yaml
network:
  network_name: testmodel
paths:
  network_path:
    - test.onnx              # make sure this path is correct (relative or absolute)
  alls_script: null
parser:
  nodes: ["input", "output"]
  start_node_shapes:
    input: [1, 1, 32, 160]   # match your model input exactly
preprocessing:
  network_type: classification
evaluation:
  dataset_name: dummy
  classes: 37
  labels_offset: 0
targets:
  - device: hailo8
```
Key point: Don’t include url at all. Having it there makes the model zoo try to download something, which you don’t need for local models.
Recommended Flow: Parse → Optimize → Compile
Parse (Convert to Hailo Format)

```
hailomz parse --ckpt test.onnx --yaml config.yaml
```

- Creates a .har file (Hailo's parsed model format)
- You can validate the parsed model with hailomz eval --target emulator
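
If the parse step complains about node names or shapes, it helps to confirm them directly from the exported ONNX before editing the YAML again. Here is a minimal sketch using onnxruntime (assuming it is installed; the model path is just the example name from above):

```python
# Print the ONNX input/output names and shapes so they can be matched
# against parser.nodes and parser.start_node_shapes in the YAML.
import onnxruntime as ort

sess = ort.InferenceSession("test.onnx")  # adjust path to your model
for inp in sess.get_inputs():
    print("input :", inp.name, inp.shape)
for out in sess.get_outputs():
    print("output:", out.name, out.shape)
```

If the names printed here differ from "input" and "output", update the nodes list in the YAML accordingly.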
Optimize (Quantize and Tune)

```
hailomz optimize --yaml config.yaml
```

- Handles quantization and graph optimizations
- Make sure you have calibration data (1024+ images work best)
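
The calibration images should already look like what the model expects at its input, in this case single-channel 32x160 plate crops. Below is a rough sketch for preparing such a folder; the directory names are placeholders, and how you point the optimization step at it (for example via a calibration-path option or the preprocessing defined in your YAML/alls) depends on your setup:

```python
# Resize raw plate crops to the model's 1x32x160 input and collect them
# into a calibration folder. Source/destination paths are placeholders.
from pathlib import Path
from PIL import Image

src = Path("plate_crops")      # hypothetical folder with raw plate images
dst = Path("calib_images")
dst.mkdir(exist_ok=True)

for i, p in enumerate(sorted(src.glob("*.jpg"))[:1024]):
    img = Image.open(p).convert("L")   # grayscale -> 1 channel
    img = img.resize((160, 32))        # (width, height) -> shape [1, 1, 32, 160]
    img.save(dst / f"{i:05d}.png")
```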
Compile (Generate HEF)

```
hailomz compile --yaml config.yaml
```

- Creates the final .hef file for the Hailo-8 runtime
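
Once the .hef is produced, a quick way to confirm it loads and contains the expected network group is to open it with the HailoRT Python bindings. A small sketch, assuming the hailo_platform package is installed and the output file is named after network_name:

```python
# Load the compiled HEF and list its network groups as a sanity check.
# Assumes HailoRT's Python bindings (hailo_platform) are installed;
# the file name is an assumption based on network_name in the YAML.
from hailo_platform import HEF

hef = HEF("testmodel.hef")
print(hef.get_network_group_names())
```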
The 3-stage approach helps you catch issues early and validate each step!
Let me know if you hit any more errors while working through this flow!