I get this warning message when running hailomz optimize. How can I make sure it is able to use the GPU?
Hi,
The best option is to use the AI SW Suite Docker image, where all the needed CUDA libraries are already installed in the right versions. You can refer to the AI SW Suite documentation for the list of GPU models supported by our SW.
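As a quick sanity check (a minimal sketch, assuming nvidia-smi is available inside the container and the DFC's TensorFlow-based environment is active), you can confirm the GPU is actually visible before running hailomz optimize:

```
# Check that the NVIDIA driver and GPU are visible inside the container
nvidia-smi

# The quantization/emulation flow runs on TensorFlow, so verify it can see the GPU
python -c "import tensorflow as tf; print(tf.config.list_physical_devices('GPU'))"
```

If the second command prints an empty list, the optimizer will most likely fall back to CPU, which is what the warning indicates.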
Hi Nadav, I tried using the Docker image for generating a custom HEF. Everything went fine until I generated the .pt model, and I was also able to export the model to ONNX format. But when I tried to use hailomz to parse and optimize the model into a .har archive, I got some NoneType errors from the Model Zoo libraries. That's why I was trying this way.
Oh, that’s too bad. I’m not sure what caused that.
What is the GPU model that you have? Is that a supported type?
It's a GTX 1650. Where can I check for supported models?
This is the output after running hailomz compile:
```
(hailo_dfc) professor_paradox@omnitrix:~/Documents/retinex/hailo_model_zoo$ hailomz compile --ckpt yolov8n.onnx --calib-path …/images/ --yaml hailo_model_zoo/cfg/networks/yolov8n_seg.yaml --classes 1
Start run for network yolov8n_seg ...
Initializing the hailo8 runner...
[info] Translation started on ONNX model yolov8n_seg
[info] Restored ONNX model yolov8n_seg (completion time: 00:00:00.09)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.47)
[info] Simplified ONNX model for a parsing retry attempt (completion time: 00:00:01.17)
Traceback (most recent call last):
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 179, in translate_onnx_model
    parsing_results = self._parse_onnx_model_to_hn(onnx_model, valid_net_name, start_node_names,
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 237, in _parse_onnx_model_to_hn
    return self.parse_model_to_hn(onnx_model, None, net_name, start_node_names, end_node_names,
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 255, in parse_model_to_hn
    converter = ONNXConverter(model=model,
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py", line 77, in __init__
    super(ONNXConverter, self).__init__(
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 48, in __init__
    self._calculate_valid_subgraph_scope()
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 422, in _calculate_valid_subgraph_scope
    current_vertex.in_valid_subgraph = True
AttributeError: 'NoneType' object has no attribute 'in_valid_subgraph'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/home/professor_paradox/Documents/retinex/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/home/professor_paradox/Documents/retinex/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
  File "/home/professor_paradox/Documents/retinex/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 250, in compile
    _ensure_optimized(runner, logger, args, network_info)
  File "/home/professor_paradox/Documents/retinex/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 73, in _ensure_optimized
    _ensure_parsed(runner, logger, network_info, args)
  File "/home/professor_paradox/Documents/retinex/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 108, in _ensure_parsed
    parse_model(runner, network_info, ckpt_path=args.ckpt_path, results_dir=args.results_dir, logger=logger)
  File "/home/professor_paradox/Documents/retinex/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 97, in parse_model
    model_name = translate_model(runner, network_info, ckpt_path, tensor_shapes=start_node_shapes)
  File "/home/professor_paradox/Documents/retinex/hailo_model_zoo/hailo_model_zoo/utils/parse_utils.py", line 28, in translate_model
    runner.translate_onnx_model(
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 876, in translate_onnx_model
    parser.translate_onnx_model(model=model, net_name=net_name, start_node_names=start_node_names,
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 211, in translate_onnx_model
    parsing_results = self._parse_onnx_model_to_hn(simplified_model, valid_net_name,
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 237, in _parse_onnx_model_to_hn
    return self.parse_model_to_hn(onnx_model, None, net_name, start_node_names, end_node_names,
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 255, in parse_model_to_hn
    converter = ONNXConverter(model=model,
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/model_translator/onnx_translator/onnx_translator.py", line 77, in __init__
    super(ONNXConverter, self).__init__(
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 48, in __init__
    self._calculate_valid_subgraph_scope()
  File "/home/professor_paradox/Documents/retinex/hailo_dfc/lib/python3.10/site-packages/hailo_sdk_client/model_translator/translator.py", line 422, in _calculate_valid_subgraph_scope
    current_vertex.in_valid_subgraph = True
AttributeError: 'NoneType' object has no attribute 'in_valid_subgraph'
```
@shubham
Have you found a solution? I also have a GTX 1650 and cannot compile the model without a GPU; I am getting some errors. Although the Hailo docs mention the GPU as optional.
Hello @saurabh. I could not figure out the issue, so I just created a Dockerfile that sets up an environment suitable for the Hailo tools. You can try it.
[Dockerfile]
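For reference, a Dockerfile along these lines would look roughly as follows (a sketch only; the base-image tag and the extra apt packages are assumptions on my part, and the file linked above may differ):

```
# Sketch of a CUDA + cuDNN + Python 3.10 base environment for the Hailo tools.
# Assumption: a CUDA 11.x/cuDNN devel image on Ubuntu 22.04, where Python 3.10
# is the default python3.
FROM nvidia/cuda:11.8.0-cudnn8-devel-ubuntu22.04

ENV DEBIAN_FRONTEND=noninteractive

# graphviz and its headers are included here on the assumption that some DFC
# Python dependencies need them
RUN apt-get update && apt-get install -y \
        python3.10 python3.10-venv python3.10-dev python3-pip \
        git graphviz libgraphviz-dev \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /workspace
CMD ["/bin/bash"]
```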
@shubham
Thank you so much for the response. Are you able to compile the model now? I am trying to compile a YOLOv8 pose model but can't, and I am trying to find guides.
I tried the Docker suite, and also tried to compile directly with the Hailo DFC without the Model Zoo. I was able to perform quantization without calibration data, but I could not compile to HEF due to unsupported-operation errors in the model.
@saurabh
I don't know if it's the correct way or not, but I did the following and was able to get the .hef: I installed all the dependencies of the Hailo tools inside the Docker container. Dockerfile - Google Drive
Build the Docker image:
```
docker build -t ubuntu-cuda-cudnn-python3.10 .
```
Run the container with GPU access:
```
docker run --gpus all -it ubuntu-cuda-cudnn-python3.10
```
Install the Hailo DFC and the Model Zoo in the container, then run the parse, optimize, and compile steps there with calibration data. You'll get the .hef.
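For example (a rough sketch; the DFC wheel filename and calibration directory below are placeholders, and the hailomz flags are the ones used earlier in this thread):

```
# Inside the running container: install the DFC wheel and the Model Zoo
pip install ./hailo_dataflow_compiler-<version>-py3-none-linux_x86_64.whl
git clone https://github.com/hailo-ai/hailo_model_zoo.git
cd hailo_model_zoo && pip install -e .

# Parse, optimize (with calibration images), and compile to .hef in one step
hailomz compile --ckpt yolov8n.onnx \
    --calib-path <path-to-calibration-images> \
    --yaml hailo_model_zoo/cfg/networks/yolov8n_seg.yaml \
    --classes 1
```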
Thank you, I will try.