TAPPAS LPR pipeline for RPi 5 with Hailo-8L

Hello,

I am working on a project with a very tight deadline (this Sunday, which is probably why I am posting this question at 3:15 AM).

I am looking to implement a solution similar to the TAPPAS LPR pipeline.

I have a Raspberry Pi 5 with a Hailo-8L. I have been working on this since this morning (this entire thing is new to me) and I have realised a few things:

  • HEF is a Hailo Executable File, which is a different format from ONNX or PyTorch files.
  • The Dataflow Compiler (DFC) has been announced and can be used to convert ONNX and other model formats into HEF.
  • HEF files compiled for the Hailo-8 cannot be used on the Hailo-8L that the Raspberry Pi is running.
  • I was able to install TAPPAS on the RPi with a manual installation (however, since the existing models ship as Hailo-8 HEF files, I cannot use any of them, especially the LPR pipeline, which is the one I wanted to run).

Now, what I am looking for is:

  • Can I convert a Hailo-8 model to a Hailo-8L model with the DFC?
  • Can someone from the community help me create a TAPPAS-style LPR pipeline (detecting vehicles, detecting number plates, and recognising the plate text) for the RPi 5?

This is super important; any help or guidance will be highly appreciated!

Thanks

Also, I am not able to edit my original post, but I did look at the Model Zoo, which lists publicly available Hailo-8L models and has a section for LPR; however, the .hef model listed on that page is also not compatible with the Hailo-8L.

I have been able to convert a custom-trained YOLOv8m model for the Hailo-8L and run it; however, when I tried to do the same with the lprnet model, I got the following error:

(env) deep@deepVM:~/hailo$ hailomz compile --ckpt lprnet.onnx --hw-arch hailo8l --calib-path calib/ --yaml hailo_model_zoo/hailo_model_zoo/cfg/networks/lprnet.yaml
[warning] Cannot use graphviz, so no visualizations will be created
<Hailo Model Zoo INFO> Start run for network lprnet ...
<Hailo Model Zoo INFO> Initializing the hailo8l runner...
[info] Translation started on ONNX model lprnet
[info] Restored ONNX model lprnet (completion time: 00:00:00.32)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.92)
[info] Simplified ONNX model for a parsing retry attempt (completion time: 00:00:01.67)
Traceback (most recent call last):
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 179, in translate_onnx_model
    parsing_results = self._parse_onnx_model_to_hn(onnx_model, valid_net_name, start_node_names,
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 237, in _parse_onnx_model_to_hn
    return self.parse_model_to_hn(onnx_model, None, net_name, start_node_names, end_node_names,
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 255, in parse_model_to_hn
    converter = ONNXConverter(model=model,
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py", line 77, in __init__
    super(ONNXConverter, self).__init__(
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/model_translator/translator.py", line 48, in __init__
    self._calculate_valid_subgraph_scope()
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/model_translator/translator.py", line 422, in _calculate_valid_subgraph_scope
    current_vertex.in_valid_subgraph = True
AttributeError: 'NoneType' object has no attribute 'in_valid_subgraph'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/deep/hailo/env/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/home/deep/hailo/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/home/deep/hailo/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
  File "/home/deep/hailo/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 250, in compile
    _ensure_optimized(runner, logger, args, network_info)
  File "/home/deep/hailo/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 73, in _ensure_optimized
    _ensure_parsed(runner, logger, network_info, args)
  File "/home/deep/hailo/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 108, in _ensure_parsed
    parse_model(runner, network_info, ckpt_path=args.ckpt_path, results_dir=args.results_dir, logger=logger)
  File "/home/deep/hailo/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 97, in parse_model
    model_name = translate_model(runner, network_info, ckpt_path, tensor_shapes=start_node_shapes)
  File "/home/deep/hailo/hailo_model_zoo/hailo_model_zoo/utils/parse_utils.py", line 28, in translate_model
    runner.translate_onnx_model(
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 876, in translate_onnx_model
    parser.translate_onnx_model(model=model, net_name=net_name, start_node_names=start_node_names,
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 211, in translate_onnx_model
    parsing_results = self._parse_onnx_model_to_hn(simplified_model, valid_net_name,
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 237, in _parse_onnx_model_to_hn
    return self.parse_model_to_hn(onnx_model, None, net_name, start_node_names, end_node_names,
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 255, in parse_model_to_hn
    converter = ONNXConverter(model=model,
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py", line 77, in __init__
    super(ONNXConverter, self).__init__(
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/model_translator/translator.py", line 48, in __init__
    self._calculate_valid_subgraph_scope()
  File "/home/deep/hailo/env/lib/python3.8/site-packages/hailo_sdk_client/model_translator/translator.py", line 422, in _calculate_valid_subgraph_scope
    current_vertex.in_valid_subgraph = True
AttributeError: 'NoneType' object has no attribute 'in_valid_subgraph'

For people reading in the future: I installed a specific version of PyTorch from here, ran the export script, and my ONNX model conversion was successful. Then I ran this command:

hailomz compile --ckpt lprnet.onnx --hw-arch hailo8l --calib-path calib/ --yaml hailo_model_zoo/hailo_model_zoo/cfg/networks/lprnet.yaml

Compilation has started. I will keep the thread posted with the output.

Update: the .hef file has been created successfully. It took 2-3 minutes, which is very different from YOLOv8m, where compilation took 5 hours. Now I need to figure out how to run lprnet.hef on the Raspberry Pi 5.
Any suggestions?

Hi, you should be able to run the TAPPAS pipeline on the Pi.
You’ll need to change the HEF paths and the post-process paths to point to the new locations.
The post-process libraries are installed on the Pi as part of tappas_core; see the hailo-rpi5-examples setup_env.sh script to see how to get them.
TAPPAS_POST_PROC_DIR is set to /usr/lib/aarch64-linux-gnu/hailo/tappas/post_processes
Note that one of them is hiding in the cropping-algorithms dir: /usr/lib/aarch64-linux-gnu/hailo/tappas/post_processes/cropping_algorithms/liblpr_croppers.so
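To make those paths less error-prone, a small stdlib-only helper can resolve a post-process .so against TAPPAS_POST_PROC_DIR (falling back to the default location quoted above) and check the cropping_algorithms subdirectory too. This is just a convenience sketch; the directory layout is taken from the post above, and any .so names you pass in should be verified against your own install.

```python
import os

# Default TAPPAS post-process location on the Pi (from the post above).
# The TAPPAS_POST_PROC_DIR env var, if set by setup_env.sh, takes precedence.
POST_PROC_DIR = os.environ.get(
    "TAPPAS_POST_PROC_DIR",
    "/usr/lib/aarch64-linux-gnu/hailo/tappas/post_processes",
)

def find_postprocess(name):
    """Return the full path to a post-process .so, checking the main
    directory first and then the cropping_algorithms subdirectory."""
    for sub in ("", "cropping_algorithms"):
        candidate = os.path.join(POST_PROC_DIR, sub, name)
        if os.path.isfile(candidate):
            return candidate
    # Not found on this machine: return the expected main-dir path
    # so the caller can print a useful error message.
    return os.path.join(POST_PROC_DIR, name)
```

Calling `find_postprocess("liblpr_croppers.so")` on the Pi should resolve to the cropping_algorithms path mentioned above.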

You can also use hailo-rpi5-examples as a boilerplate for the LPR pipeline. You’ll need to change the get_pipeline_string() function to return the string for the LPR pipeline. You can generate it using the bash script and return it as-is (it might need some adjustments to support some features of the example, such as the FPS display and the frame callback).
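A rough sketch of what such a get_pipeline_string() could return is below, modeled loosely on the two-network TAPPAS LPR structure (vehicle/plate detection, then crop, then lprnet OCR, then re-aggregate). The element names (hailonet, hailofilter, hailotracker, hailocropper, hailoaggregator, hailooverlay) are TAPPAS GStreamer elements, but every file path and .so name here is a placeholder assumption that must be replaced with your own HEFs and the post-process libraries installed on your Pi.

```python
def get_pipeline_string():
    """Sketch of a two-stage LPR GStreamer pipeline string.

    All paths are PLACEHOLDERS: point them at your own compiled HEFs
    and the post-process .so files from your TAPPAS install. Source,
    caps, and queue tuning are kept minimal on purpose.
    """
    postproc = "/usr/lib/aarch64-linux-gnu/hailo/tappas/post_processes"
    vehicle_hef = "/home/pi/resources/yolov8m_vehicles_h8l.hef"  # placeholder
    lprnet_hef = "/home/pi/resources/lprnet_h8l.hef"             # placeholder
    return (
        "v4l2src device=/dev/video0 ! videoconvert ! "
        f"hailonet hef-path={vehicle_hef} ! "
        f"hailofilter so-path={postproc}/libyolo_post.so qos=false ! "  # placeholder .so
        "hailotracker name=hailo_tracker ! "
        f"hailocropper so-path={postproc}/cropping_algorithms/liblpr_croppers.so "
        "name=cropper hailoaggregator name=agg "
        # Bypass branch: full frames go straight to the aggregator.
        "cropper. ! queue ! agg. "
        # Crop branch: plate crops go through lprnet OCR.
        f"cropper. ! queue ! hailonet hef-path={lprnet_hef} ! "
        f"hailofilter so-path={postproc}/liblpr_ocr.so qos=false ! agg. "  # placeholder .so
        "agg. ! hailooverlay ! videoconvert ! autovideosink"
    )
```

Since the function only builds a string, it is easy to print it and compare against a working gst-launch-1.0 command line before wiring it into the example app.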

Looking forward to seeing your project. Note that you’ll need to retrain lprnet, and maybe edit the post-process, as it is currently built for Israeli license plates.

Hey, thank you for the response @giladn

I tried to run TAPPAS and it did not work. It also involves a lot of cross-compilation, and I would need to recompile the other two models for the Hailo-8L.

I then decided to use hailo-rpi5-examples, as I find myself more comfortable with that, but unfortunately I am stuck at creating the LPR pipeline string. I’m still figuring it out (maybe I am too sleep-deprived to do so at the moment).

Just curious: is it possible for you to point me to an example, or even an LPR pipeline string itself, where I can run YOLOv8m along with LPR? I was able to run YOLOv8m for licence-plate detection by compiling it and using the detection.py file from the rpi5 examples, passing the resources and HEF as arguments. Any help will be really, really appreciated :slightly_smiling_face:

See below a snip from the video I am able to run with my trained YOLOv8m.

Hi, I don’t understand what you are looking for.
What do you mean by “LPR string”? The GStreamer pipeline string?
You should be OK with just replacing yolov5m_vehicles.hef with your YOLOv8 model. You will need to remove detections which are not ‘car’, ‘truck’, ‘bus’, etc.; I’m not sure if you’ll need to merge them all into one class. You can do that in the user callback, and you should move it before the tracker.
You can also retrain and create a version for the H8L; see the vehicle detection retraining guide.
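The filtering step suggested above can be sketched in pure Python. This is a simplified stand-in: the real hailo-rpi5-examples callback iterates hailo detection objects pulled from the GstBuffer, whereas here plain dicts with a "label" key are used so the logic is visible on its own.

```python
# Keep only vehicle-type detections and merge them into a single
# "vehicle" class before they reach the tracker, as suggested above.
# The class set is an assumption based on common COCO vehicle labels.
VEHICLE_CLASSES = {"car", "truck", "bus", "motorcycle"}

def filter_vehicles(detections):
    """Drop non-vehicle detections; relabel the rest as 'vehicle'."""
    kept = []
    for det in detections:
        if det["label"] in VEHICLE_CLASSES:
            # Merge all vehicle types into one class so the downstream
            # LPR stages treat every vehicle uniformly.
            kept.append({**det, "label": "vehicle"})
    return kept
```

In the real callback the same two decisions apply: which labels count as vehicles, and whether to collapse them into one class for the tracker.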