Here's what I understand is required to start the process:
Install:
+ Dataflow Compiler
+ Model Zoo
Model conversion:
+ .pt → .onnx → .hef (a rough sketch of the export step is below)
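For the .pt → .onnx step, this is roughly what I have in mind using the Ultralytics export API (the checkpoint path, image size, and opset below are placeholders for my setup, not confirmed values):

```python
# Sketch of the .pt -> .onnx export step using the Ultralytics API.
# "best.pt" is a placeholder for my trained checkpoint; imgsz/opset are assumptions.
from ultralytics import YOLO

model = YOLO("best.pt")   # trained YOLOv8n checkpoint (placeholder path)
model.export(
    format="onnx",        # writes best.onnx next to the checkpoint
    imgsz=640,            # export resolution (assumed 640x640)
    opset=11,             # ONNX opset version (assumption)
)
```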
I'm currently starting to practice with this resource: "How to Train a Model for the Raspberry Pi AI Kit" (Google Docs).
Do you have any suggestions or other approaches?
Right now, though, I'm stuck on a problem with this command:
(env) pi@DESKTOP-99J9B6A:~/data$ hailomz compile yolov8n --ckpt=/home/pi/data/test.onnx --hw-arch hailo8 --calib-path /home/pi/data/license.v2i.yolov11/test/images --classes 1 --performance
Start run for network yolov8n …
Initializing the hailo8 runner…
[info] Translation started on ONNX model yolov8n
[info] Restored ONNX model yolov8n (completion time: 00:00:00.05)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.27)
[info] Simplified ONNX model for a parsing retry attempt (completion time: 00:00:01.43)
[info] According to recommendations, retrying parsing with end node names: ['/model.22/Concat_3'].
[info] Translation started on ONNX model yolov8n
[info] Restored ONNX model yolov8n (completion time: 00:00:00.03)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.17)
[info] NMS structure of yolov8 (or equivalent architecture) was detected.
[info] In order to use HailoRT post-processing capabilities, these end node names should be used: /model.22/cv2.0/cv2.0.2/Conv /model.22/cv3.0/cv3.0.2/Conv /model.22/cv2.1/cv2.1.2/Conv /model.22/cv3.1/cv3.1.2/Conv /model.22/cv2.2/cv2.2.2/Conv /model.22/cv3.2/cv3.2.2/Conv.
[info] Start nodes mapped from original model: 'images': 'yolov8n/input_layer1'.
[info] End nodes mapped from original model: '/model.22/Concat_3'.
[info] Translation completed on ONNX model yolov8n (completion time: 00:00:00.98)
[info] Translation started on ONNX model yolov8n
[info] Restored ONNX model yolov8n (completion time: 00:00:00.03)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.20)
[info] NMS structure of yolov8 (or equivalent architecture) was detected.
[info] In order to use HailoRT post-processing capabilities, these end node names should be used: /model.22/cv2.0/cv2.0.2/Conv /model.22/cv3.0/cv3.0.2/Conv /model.22/cv2.1/cv2.1.2/Conv /model.22/cv3.1/cv3.1.2/Conv /model.22/cv2.2/cv2.2.2/Conv /model.22/cv3.2/cv3.2.2/Conv.
[info] Start nodes mapped from original model: 'images': 'yolov8n/input_layer1'.
[info] End nodes mapped from original model: '/model.22/cv2.0/cv2.0.2/Conv', '/model.22/cv3.0/cv3.0.2/Conv', '/model.22/cv2.1/cv2.1.2/Conv', '/model.22/cv3.1/cv3.1.2/Conv', '/model.22/cv2.2/cv2.2.2/Conv', '/model.22/cv3.2/cv3.2.2/Conv'.
[info] Translation completed on ONNX model yolov8n (completion time: 00:00:00.86)
[info] Appending model script commands to yolov8n from string
[info] Added nms postprocess command to model script.
[info] Saved HAR to: /home/pi/data/yolov8n.har
Using generic alls script found in /home/pi/data/hailo_model_zoo/hailo_model_zoo/cfg/alls/generic/yolov8n.alls because there is no specific hardware alls
Preparing calibration data…
[info] Loading model script commands to yolov8n from /home/pi/data/hailo_model_zoo/hailo_model_zoo/cfg/alls/generic/yolov8n.alls
[info] Loading model script commands to yolov8n from string
[info] Found model with 3 input channels, using real RGB images for calibration instead of sampling random data.
[info] Starting Model Optimization
[warning] Reducing optimization level to 0 (the accuracy won't be optimized and compression won't be used) because there's less data than the recommended amount (1024), and there's no available GPU
[warning] Running model optimization with zero level of optimization is not recommended for production use and might lead to suboptimal accuracy results
[info] Model received quantization params from the hn
Traceback (most recent call last):
File "/home/pi/data/env/bin/hailomz", line 33, in <module>
sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
File "/home/pi/data/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
run(args)
File "/home/pi/data/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
return handlers[args.command](args)
File "/home/pi/data/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 248, in compile
ensure_optimized(runner, logger, args, network_info)
File "/home/pi/data/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 91, in ensure_optimized
optimize_model(
File "/home/pi/data/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 353, in optimize_model
runner.optimize(calib_feed_callback)
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
return func(self, *args, **kwargs)
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 2201, in optimize
result = self.optimize(
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
return func(self, *args, **kwargs)
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 2020, in optimize
checkpoint_info = self.sdk_backend.full_quantization(
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1196, in full_quantization
new_checkpoint_info = self.full_acceleras_run(
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1434, in full_acceleras_run
new_checkpoint_info = self.optimization_flow_runner(optimization_flow, checkpoint_info)
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 2088, in optimization_flow_runner
optimization_flow.run()
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/tools/orchestator.py", line 239, in wrapper
return func(self, *args, **kwargs)
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 357, in run
step_func()
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/tools/subprocess_wrapper.py", line 154, in parent_wrapper
self.build_model()
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 260, in build_model
model.compute_output_shape(shapes)
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1153, in compute_output_shape
return self.compute_and_verify_output_shape(input_shape, verify_layer_inputs_shape=False)
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1187, in compute_and_verify_output_shape
layer_output_shape = layer.compute_output_shape(layer_input_shapes)
File "/home/pi/data/env/lib/python3.10/site-packages/keras/engine/base_layer.py", line 917, in compute_output_shape
outputs = self(inputs, training=False)
File "/home/pi/data/env/lib/python3.10/site-packages/keras/utils/traceback_utils.py", line 70, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/tmp/autograph_generated_fileqjdkctya.py", line 41, in tf__call
outputs = ag.converted_call(ag.ld(self).call_core, (ag.ld(inputs), ag.ld(training)), dict(**ag.ld(kwargs)), fscope)
File "/tmp/autograph_generated_file68n7lryb.py", line 90, in tf__call_core
ag.if_stmt(ag_.ld(self).postprocess_type in [ag__.ld(PostprocessType).NMS, ag__.ld(PostprocessType).BBOX_DECODER], if_body_3, else_body_3, get_state_3, set_state_3, (‘do_return’, ‘retval_’), 2)
File "/tmp/autograph_generated_file68n7lryb.py", line 22, in if_body_3
retval = ag_.converted_call(ag__.ld(self).bbox_decoding_and_nms_call, (ag__.ld(inputs),), dict(is_bbox_decoding_only=ag__.ld(self).postprocess_type == ag__.ld(PostprocessType).BBOX_DECODER), fscope)
File "/tmp/autograph_generated_fileii1qu4ol.py", line 116, in tf__bbox_decoding_and_nms_call
ag.if_stmt(ag__.ld(self).meta_arch in [ag__.ld(NMSOnCpuMetaArchitectures).YOLOV5, ag__.ld(NMSOnCpuMetaArchitectures).YOLOX], if_body_5, else_body_5, get_state_5, set_state_5, (‘decoded_bboxes’, ‘detection_score’, ‘do_return’, ‘retval_’, ‘inputs’), 4)
File "/tmp/autograph_generated_fileii1qu4ol.py", line 113, in else_body_5
ag.if_stmt(ag__.ld(self).meta_arch == ag__.ld(NMSOnCpuMetaArchitectures).YOLOV5_SEG, if_body_4, else_body_4, get_state_4, set_state_4, (‘decoded_bboxes’, ‘detection_score’, ‘do_return’, ‘retval_’), 4)
File "/tmp/autograph_generated_fileii1qu4ol.py", line 110, in else_body_4
ag.if_stmt(ag__.ld(self).meta_arch == ag__.ld(NMSOnCpuMetaArchitectures).YOLOV8, if_body_3, else_body_3, get_state_3, set_state_3, (‘decoded_bboxes’, ‘detection_score’), 2)
File "/tmp/autograph_generated_fileii1qu4ol.py", line 69, in if_body_3
(decoded_bboxes, detection_score) = ag.converted_call(ag__.ld(self).yolov8_decoding_call, (ag__.ld(inputs),), dict(offsets=[0.5, 0.5]), fscope)
File "/tmp/autograph_generated_fileem_hqqy9.py", line 87, in tf__yolov8_decoding_call
decoded_bboxes = ag.converted_call(ag__.ld(tf).expand_dims, (ag__.ld(decoded_bboxes),), dict(axis=2), fscope)
ValueError: Exception encountered when calling layer "yolov8_nms_postprocess" (type HailoPostprocess).
in user code:
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/acceleras/hailo_layers/base_hailo_none_nn_core_layer.py", line 45, in call *
outputs = self.call_core(inputs, training, **kwargs)
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 123, in call_core *
is_bbox_decoding_only=self.postprocess_type == PostprocessType.BBOX_DECODER,
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 157, in bbox_decoding_and_nms_call *
decoded_bboxes, detection_score = self.yolov8_decoding_call(inputs, offsets=[0.5, 0.5])
File "/home/pi/data/env/lib/python3.10/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 375, in yolov8_decoding_call *
decoded_bboxes = tf.expand_dims(decoded_bboxes, axis=2)
ValueError: Tried to convert 'input' to a tensor and failed. Error: None values not supported.
Call arguments received by layer "yolov8_nms_postprocess" (type HailoPostprocess):
• inputs=['tf.Tensor(shape=(None, 80, 80, 64), dtype=float32)', 'tf.Tensor(shape=(None, 80, 80, 2), dtype=float32)', 'tf.Tensor(shape=(None, 40, 40, 64), dtype=float32)', 'tf.Tensor(shape=(None, 40, 40, 2), dtype=float32)', 'tf.Tensor(shape=(None, 20, 20, 64), dtype=float32)', 'tf.Tensor(shape=(None, 20, 20, 2), dtype=float32)']
• training=False
• kwargs=<class 'inspect._empty'>
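From the log, the failure seems to happen inside the yolov8_nms_postprocess layer after parsing ended at /model.22/Concat_3, even though the parser itself recommends ending at the six Conv nodes. Would re-parsing with those recommended end nodes be the right direction? A rough sketch of what I mean with the DFC Python API (the net name, input shape, and output path are placeholders, not values I have verified):

```python
# Sketch only: re-translate the ONNX so parsing stops at the end nodes the
# parser recommends, letting HailoRT handle the NMS post-processing.
from hailo_sdk_client import ClientRunner

end_nodes = [
    "/model.22/cv2.0/cv2.0.2/Conv", "/model.22/cv3.0/cv3.0.2/Conv",
    "/model.22/cv2.1/cv2.1.2/Conv", "/model.22/cv3.1/cv3.1.2/Conv",
    "/model.22/cv2.2/cv2.2.2/Conv", "/model.22/cv3.2/cv3.2.2/Conv",
]

runner = ClientRunner(hw_arch="hailo8")
runner.translate_onnx_model(
    "/home/pi/data/test.onnx",                       # my exported ONNX
    "yolov8n",                                       # net name (placeholder)
    start_node_names=["images"],
    end_node_names=end_nodes,
    net_input_shapes={"images": [1, 3, 640, 640]},   # assumed 640x640 input
)
runner.save_har("/home/pi/data/yolov8n_parsed.har")  # placeholder output path
```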