Do you support r3d_18 in hailo_model_zoo?

I’m trying to parse the r3d_18 model with hailomz, but I get an error stating that the input rank is unsupported.

/work# hailomz parse r3d_18
[warning] Cannot use graphviz, so no visualizations will be created
<Hailo Model Zoo INFO> Start run for network r3d_18 ...
<Hailo Model Zoo INFO> Initializing the runner...
r3d_18.zip: 100%|████████████████████████████████████████████| 118M/118M [00:37<00:00, 3.28MB/s]
[info] Translation started on ONNX model r3d_18
[info] Restored ONNX model r3d_18 (completion time: 00:00:00.49)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:01.58)
[info] Simplified ONNX model for a parsing retry attempt (completion time: 00:00:02.87)
Traceback (most recent call last):
  File "/env/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/work/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/work/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
  File "/work/hailo_model_zoo/main_driver.py", line 201, in parse
    parse_model(runner, network_info, ckpt_path=args.ckpt_path, results_dir=args.results_dir, logger=logger)
  File "/work/hailo_model_zoo/core/main_utils.py", line 127, in parse_model
    raise Exception(f"Encountered error during parsing: {err}") from None
Exception: Encountered error during parsing: Unsupported shape [1, 3, 16, 112, 112] found on input node input.1. Input shape cannot be in rank 5.

Does hailomz provide an operation to convert r3d_18’s input rank from 5 to 4? Or are the ONNX files downloaded by hailomz or from the Model Explorer assumed to have already been converted for Hailo-8? In other words, is hailomz parse r3d_18 not usable as-is, and do I have to perform the rank conversion myself? Please let me know.
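
For reference, this is how the input rank of the downloaded ONNX could be checked (a minimal sketch; the filename r3d_18.onnx is my assumption about how the downloaded archive extracts):

import onnx

# Load the ONNX file and print the declared shape of each graph input.
model = onnx.load("r3d_18.onnx")
for inp in model.graph.input:
    dims = [d.dim_value for d in inp.type.tensor_type.shape.dim]
    print(inp.name, dims)  # the parser error above suggests input.1 -> [1, 3, 16, 112, 112]

If this prints a five-element shape, the downloaded file has not been pre-converted and still carries the original rank-5 video input.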

Thank you.

I just tested this in the Hailo AI Software Suite Docker 2025-04 and it works as expected. Maybe you have a dependency issue in your environment. Can you install that Docker image and try again?