I trained models with yolov8s and yolov8n using the instructions and Dockerfile from hailo_model_zoo/training/yolov8 (master branch of hailo-ai/hailo_model_zoo on GitHub).
Inside the YOLO training Docker container:
yolo detect train data=/app/yolotrain.yaml model=yolov8s epochs=100 batch=16 imgsz=446 name=8s_v1.0
yolo detect train data=/app/yolotrain.yaml model=yolov8n epochs=100 batch=16 imgsz=446 name=8n_v1.0
yolo export model=/workspace/ultralytics/runs/detect/8n_v1.0/weights/best.pt imgsz=448 format=onnx opset=11
yolo export model=/workspace/ultralytics/runs/detect/8s_v1.0/weights/best.pt imgsz=448 format=onnx opset=11
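For reference, a quick way to confirm the exported input size before moving the files over (just a rough check, assuming the onnx package is available; the path is from my run above):

# Quick sanity check that the exported ONNX really has a 448x448 input.
# Requires the `onnx` package; the path below is from my training run.
import onnx

model = onnx.load("/workspace/ultralytics/runs/detect/8s_v1.0/weights/best.onnx")
for inp in model.graph.input:
    dims = [d.dim_value for d in inp.type.tensor_type.shape.dim]
    print(inp.name, dims)  # expecting something like: images [1, 3, 448, 448]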
I then copied those ONNX files into my hailo_ai_sw_2024-07.1 container so that I can compile them (or parse, optimize, then compile; I'm still a little foggy on why some tutorials skip straight to compile while others parse and optimize first, so my rough understanding of the flow is sketched below).
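As far as I understand, hailomz compile simply runs all three stages back to back, which is why some tutorials jump straight to it. Using the Dataflow Compiler API directly, the same flow would look roughly like this (the paths, end-node names, and calibration data are placeholders from my setup, not the exact Model Zoo internals):

# Sketch of the parse -> optimize -> compile flow with the DFC API.
# This mirrors, as far as I understand, what hailomz does under the hood;
# paths and the dummy calibration data are placeholders only.
import numpy as np
from hailo_sdk_client import ClientRunner

runner = ClientRunner(hw_arch="hailo8l")

# 1) Parse: translate the ONNX into a HAR, cutting at the conv outputs
#    that the parser suggested for the NMS postprocess.
runner.translate_onnx_model(
    "8s_v1.0_hel.onnx",
    "yolov8s",
    end_node_names=[
        "/model.22/cv2.0/cv2.0.2/Conv", "/model.22/cv3.0/cv3.0.2/Conv",
        "/model.22/cv2.1/cv2.1.2/Conv", "/model.22/cv3.1/cv3.1.2/Conv",
        "/model.22/cv2.2/cv2.2.2/Conv", "/model.22/cv3.2/cv3.2.2/Conv",
    ],
)

# 2) Optimize: quantize against a calibration set (random data here just
#    for illustration, real calibration images should be used).
calib_data = np.random.randint(0, 255, (64, 448, 448, 3)).astype(np.float32)
runner.optimize(calib_data)

# 3) Compile: emit the HEF for the target device.
hef = runner.compile()
with open("yolov8s.hef", "wb") as f:
    f.write(hef)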
I updated my yolo.yaml to account for the new input size:
base:
- base/coco.yaml
preprocessing:
  network_type: detection
  input_shape:
  - 448
  - 448
  - 3
  meta_arch: yolo_v5
  padding_color: 114
postprocessing:
  nms_iou_thresh: 0.45
  score_threshold: 0.01
  meta_arch: yolo_v5
  anchors:
    strides:
    - 8
    - 16
    - 32
    sizes:
    - - 10
      - 13
      - 16
      - 30
      - 33
      - 23
    - - 30
      - 61
      - 62
      - 45
      - 59
      - 119
    - - 116
      - 90
      - 156
      - 198
      - 373
      - 326
  hpp: false
info:
  source: https://github.com/ultralytics/yolov5/releases/tag/v2.0
parser:
  normalization_params:
    normalize_in_net: true
    mean_list:
    - 0.0
    - 0.0
    - 0.0
    std_list:
    - 255.0
    - 255.0
    - 255.0
evaluation:
  labels_offset: 1
  classes: 80
  dataset_name: coco_2017_detection
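Side note: my reading of normalize_in_net with those mean/std values is that the usual per-channel (x - mean) / std normalization is folded into the network itself, so raw 0-255 RGB frames can be fed in. A tiny sketch of what I mean:

# What I believe normalize_in_net: true with mean 0 / std 255 amounts to:
# the model rescales raw 0-255 RGB input on-chip, so nothing has to be
# normalized on the host side.
import numpy as np

mean = np.array([0.0, 0.0, 0.0], dtype=np.float32)
std = np.array([255.0, 255.0, 255.0], dtype=np.float32)

frame = np.random.randint(0, 256, (448, 448, 3)).astype(np.float32)
normalized = (frame - mean) / std  # values end up in [0, 1]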
Parsing works properly:
hailomz parse --ckpt 8s_v1.0_hel.onnx --hw-arch hailo8l yolov8s
<Hailo Model Zoo INFO> Start run for network yolov8s ...
<Hailo Model Zoo INFO> Initializing the runner...
[info] Translation started on ONNX model yolov8s
[info] Restored ONNX model yolov8s (completion time: 00:00:00.24)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.86)
[info] NMS structure of yolov8 (or equivalent architecture) was detected.
[info] In order to use HailoRT post-processing capabilities, these end node names should be used: /model.22/cv2.0/cv2.0.2/Conv /model.22/cv3.0/cv3.0.2/Conv /model.22/cv2.1/cv2.1.2/Conv /model.22/cv3.1/cv3.1.2/Conv /model.22/cv2.2/cv2.2.2/Conv /model.22/cv3.2/cv3.2.2/Conv.
[info] Start nodes mapped from original model: 'images': 'yolov8s/input_layer1'.
[info] End nodes mapped from original model: '/model.22/cv2.0/cv2.0.2/Conv', '/model.22/cv3.0/cv3.0.2/Conv', '/model.22/cv2.1/cv2.1.2/Conv', '/model.22/cv3.1/cv3.1.2/Conv', '/model.22/cv2.2/cv2.2.2/Conv', '/model.22/cv3.2/cv3.2.2/Conv'.
[info] Translation completed on ONNX model yolov8s (completion time: 00:00:01.56)
[info] Saved HAR to: /jtech/yolov8s.har
and
hailomz parse --ckpt 8n_v1.0_hel.onnx --hw-arch hailo8l yolov8n
<Hailo Model Zoo INFO> Start run for network yolov8n ...
<Hailo Model Zoo INFO> Initializing the runner...
[info] Translation started on ONNX model yolov8n
[info] Restored ONNX model yolov8n (completion time: 00:00:00.04)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.24)
[info] NMS structure of yolov8 (or equivalent architecture) was detected.
[info] In order to use HailoRT post-processing capabilities, these end node names should be used: /model.22/cv2.0/cv2.0.2/Conv /model.22/cv3.0/cv3.0.2/Conv /model.22/cv2.1/cv2.1.2/Conv /model.22/cv3.1/cv3.1.2/Conv /model.22/cv2.2/cv2.2.2/Conv /model.22/cv3.2/cv3.2.2/Conv.
[info] Start nodes mapped from original model: 'images': 'yolov8n/input_layer1'.
[info] End nodes mapped from original model: '/model.22/cv2.0/cv2.0.2/Conv', '/model.22/cv3.0/cv3.0.2/Conv', '/model.22/cv2.1/cv2.1.2/Conv', '/model.22/cv3.1/cv3.1.2/Conv', '/model.22/cv2.2/cv2.2.2/Conv', '/model.22/cv3.2/cv3.2.2/Conv'.
[info] Translation completed on ONNX model yolov8n (completion time: 00:00:00.81)
[info] Saved HAR to: /jtech/yolov8n.har
but I get the same error whether I try to compile or optimize:
hailomz compile yolov8s --hw-arch hailo8l --har ./yolov8s.har --calib-path ./training/yolo/images/
<Hailo Model Zoo INFO> Start run for network yolov8s ...
<Hailo Model Zoo INFO> Initializing the runner...
[info] Translation started on ONNX model yolov8s
[info] Restored ONNX model yolov8s (completion time: 00:00:00.15)
[info] Extracted ONNXRuntime meta-data for Hailo model (completion time: 00:00:00.65)
[info] NMS structure of yolov8 (or equivalent architecture) was detected.
[info] In order to use HailoRT post-processing capabilities, these end node names should be used: /model.22/cv2.0/cv2.0.2/Conv /model.22/cv3.0/cv3.0.2/Conv /model.22/cv2.1/cv2.1.2/Conv /model.22/cv3.1/cv3.1.2/Conv /model.22/cv2.2/cv2.2.2/Conv /model.22/cv3.2/cv3.2.2/Conv.
[info] Start nodes mapped from original model: 'images': 'yolov8s/input_layer1'.
[info] End nodes mapped from original model: '/model.22/cv2.0/cv2.0.2/Conv', '/model.22/cv3.0/cv3.0.2/Conv', '/model.22/cv2.1/cv2.1.2/Conv', '/model.22/cv3.1/cv3.1.2/Conv', '/model.22/cv2.2/cv2.2.2/Conv', '/model.22/cv3.2/cv3.2.2/Conv'.
[info] Translation completed on ONNX model yolov8s (completion time: 00:00:01.38)
[info] Saved HAR to: /jtech/yolov8s.har
(hailo_virtualenv) hailo@docker-desktop:/jtech$ hailomz compile yolov8s --hw-arch hailo8l --har ./yolov8s.har --calib-path ./training/yolo/images/
<Hailo Model Zoo INFO> Start run for network yolov8s ...
<Hailo Model Zoo INFO> Initializing the hailo8l runner...
<Hailo Model Zoo INFO> Preparing calibration data...
[info] Loading model script commands to yolov8s from /local/workspace/hailo_model_zoo/hailo_model_zoo/cfg/alls/generic/yolov8s.alls
[info] Starting Model Optimization
[warning] Reducing optimization level to 0 (the accuracy won't be optimized and compression won't be used) because there's no available GPU
[warning] Running model optimization with zero level of optimization is not recommended for production use and might lead to suboptimal accuracy results
[info] Model received quantization params from the hn
Traceback (most recent call last):
File "/local/workspace/hailo_virtualenv/bin/hailomz", line 33, in <module>
sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
run(args)
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
return handlers[args.command](args)
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 250, in compile
_ensure_optimized(runner, logger, args, network_info)
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 91, in _ensure_optimized
optimize_model(
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 321, in optimize_model
runner.optimize(calib_feed_callback)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
return func(self, *args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 2093, in optimize
self._optimize(calib_data, data_type=data_type, work_dir=work_dir)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
return func(self, *args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 1935, in _optimize
self._sdk_backend.full_quantization(calib_data, data_type=data_type, work_dir=work_dir)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1045, in full_quantization
self._full_acceleras_run(self.calibration_data, data_type)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1229, in _full_acceleras_run
optimization_flow.run()
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/tools/orchestator.py", line 306, in wrapper
return func(self, *args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 316, in run
step_func()
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
result = method(*args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/tools/subprocess_wrapper.py", line 121, in parent_wrapper
self.build_model()
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
result = method(*args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 230, in build_model
model.compute_output_shape(shapes)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1013, in compute_output_shape
return self.compute_and_verify_output_shape(input_shape, verify_layer_inputs_shape=False)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1047, in compute_and_verify_output_shape
layer_output_shape = layer.compute_output_shape(layer_input_shapes)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/keras/engine/base_layer.py", line 917, in compute_output_shape
outputs = self(inputs, training=False)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/keras/utils/traceback_utils.py", line 70, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/tmp/__autograph_generated_filevv_9ae60.py", line 41, in tf__call
outputs = ag__.converted_call(ag__.ld(self).call_core, (ag__.ld(inputs), ag__.ld(training)), dict(**ag__.ld(kwargs)), fscope)
File "/tmp/__autograph_generated_file5sv35h47.py", line 90, in tf__call_core
ag__.if_stmt((ag__.ld(self).postprocess_type in [ag__.ld(PostprocessType).NMS, ag__.ld(PostprocessType).BBOX_DECODER]), if_body_3, else_body_3, get_state_3, set_state_3, ('do_return', 'retval_'), 2)
File "/tmp/__autograph_generated_file5sv35h47.py", line 22, in if_body_3
retval_ = ag__.converted_call(ag__.ld(self).bbox_decoding_and_nms_call, (ag__.ld(inputs),), dict(is_bbox_decoding_only=(ag__.ld(self).postprocess_type == ag__.ld(PostprocessType).BBOX_DECODER)), fscope)
File "/tmp/__autograph_generated_filexgc1j34i.py", line 99, in tf__bbox_decoding_and_nms_call
ag__.if_stmt((ag__.ld(self).meta_arch in [ag__.ld(NMSOnCpuMetaArchitectures).YOLOV5, ag__.ld(NMSOnCpuMetaArchitectures).YOLOX]), if_body_4, else_body_4, get_state_4, set_state_4, ('decoded_bboxes', 'detection_score', 'do_return', 'retval_', 'inputs'), 4)
File "/tmp/__autograph_generated_filexgc1j34i.py", line 96, in else_body_4
ag__.if_stmt((ag__.ld(self).meta_arch == ag__.ld(NMSOnCpuMetaArchitectures).YOLOV5_SEG), if_body_3, else_body_3, get_state_3, set_state_3, ('decoded_bboxes', 'detection_score', 'do_return', 'retval_'), 4)
File "/tmp/__autograph_generated_filexgc1j34i.py", line 93, in else_body_3
ag__.if_stmt((ag__.ld(self).meta_arch == ag__.ld(NMSOnCpuMetaArchitectures).YOLOV8), if_body_2, else_body_2, get_state_2, set_state_2, ('decoded_bboxes', 'detection_score'), 2)
File "/tmp/__autograph_generated_filexgc1j34i.py", line 69, in if_body_2
(decoded_bboxes, detection_score) = ag__.converted_call(ag__.ld(self).yolov8_decoding_call, (ag__.ld(inputs),), None, fscope)
File "/tmp/__autograph_generated_file7isggz0k.py", line 67, in tf__yolov8_decoding_call
decoded_bboxes = ag__.converted_call(ag__.ld(tf).expand_dims, (ag__.ld(decoded_bboxes),), dict(axis=2), fscope)
ValueError: Exception encountered when calling layer "yolov8_nms_postprocess" (type HailoPostprocess).
in user code:
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/base_hailo_none_nn_core_layer.py", line 43, in call *
outputs = self.call_core(inputs, training, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 113, in call_core *
inputs,
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 148, in bbox_decoding_and_nms_call *
decoded_bboxes, detection_score = self.yolov8_decoding_call(inputs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 355, in yolov8_decoding_call *
decoded_bboxes = tf.expand_dims(decoded_bboxes, axis=2)
ValueError: Tried to convert 'input' to a tensor and failed. Error: None values not supported.
Call arguments received by layer "yolov8_nms_postprocess" (type HailoPostprocess):
• inputs=['tf.Tensor(shape=(None, 56, 56, 64), dtype=float32)', 'tf.Tensor(shape=(None, 56, 56, 2), dtype=float32)', 'tf.Tensor(shape=(None, 28, 28, 64), dtype=float32)', 'tf.Tensor(shape=(None, 28, 28, 2), dtype=float32)', 'tf.Tensor(shape=(None, 14, 14, 64), dtype=float32)', 'tf.Tensor(shape=(None, 14, 14, 2), dtype=float32)']
• training=False
• kwargs=<class 'inspect._empty'>
and for 8n:
hailomz compile yolov8n --hw-arch hailo8l --har ./yolov8n.har --calib-path ./training/yolo/images/
<Hailo Model Zoo INFO> Start run for network yolov8n ...
<Hailo Model Zoo INFO> Initializing the hailo8l runner...
<Hailo Model Zoo INFO> Preparing calibration data...
[info] Loading model script commands to yolov8n from /local/workspace/hailo_model_zoo/hailo_model_zoo/cfg/alls/generic/yolov8n.alls
[info] Starting Model Optimization
[warning] Reducing optimization level to 0 (the accuracy won't be optimized and compression won't be used) because there's no available GPU
[warning] Running model optimization with zero level of optimization is not recommended for production use and might lead to suboptimal accuracy results
[info] Model received quantization params from the hn
Traceback (most recent call last):
File "/local/workspace/hailo_virtualenv/bin/hailomz", line 33, in <module>
sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
run(args)
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
return handlers[args.command](args)
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 250, in compile
_ensure_optimized(runner, logger, args, network_info)
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 91, in _ensure_optimized
optimize_model(
File "/local/workspace/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 321, in optimize_model
runner.optimize(calib_feed_callback)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
return func(self, *args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 2093, in optimize
self._optimize(calib_data, data_type=data_type, work_dir=work_dir)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
return func(self, *args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/runner/client_runner.py", line 1935, in _optimize
self._sdk_backend.full_quantization(calib_data, data_type=data_type, work_dir=work_dir)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1045, in full_quantization
self._full_acceleras_run(self.calibration_data, data_type)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1229, in _full_acceleras_run
optimization_flow.run()
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/tools/orchestator.py", line 306, in wrapper
return func(self, *args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 316, in run
step_func()
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
result = method(*args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/tools/subprocess_wrapper.py", line 121, in parent_wrapper
self.build_model()
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
result = method(*args, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 230, in build_model
model.compute_output_shape(shapes)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1013, in compute_output_shape
return self.compute_and_verify_output_shape(input_shape, verify_layer_inputs_shape=False)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1047, in compute_and_verify_output_shape
layer_output_shape = layer.compute_output_shape(layer_input_shapes)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/keras/engine/base_layer.py", line 917, in compute_output_shape
outputs = self(inputs, training=False)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/keras/utils/traceback_utils.py", line 70, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/tmp/__autograph_generated_filezehnp_ey.py", line 41, in tf__call
outputs = ag__.converted_call(ag__.ld(self).call_core, (ag__.ld(inputs), ag__.ld(training)), dict(**ag__.ld(kwargs)), fscope)
File "/tmp/__autograph_generated_filezg3153uh.py", line 90, in tf__call_core
ag__.if_stmt((ag__.ld(self).postprocess_type in [ag__.ld(PostprocessType).NMS, ag__.ld(PostprocessType).BBOX_DECODER]), if_body_3, else_body_3, get_state_3, set_state_3, ('do_return', 'retval_'), 2)
File "/tmp/__autograph_generated_filezg3153uh.py", line 22, in if_body_3
retval_ = ag__.converted_call(ag__.ld(self).bbox_decoding_and_nms_call, (ag__.ld(inputs),), dict(is_bbox_decoding_only=(ag__.ld(self).postprocess_type == ag__.ld(PostprocessType).BBOX_DECODER)), fscope)
File "/tmp/__autograph_generated_filen0w1t1lj.py", line 99, in tf__bbox_decoding_and_nms_call
ag__.if_stmt((ag__.ld(self).meta_arch in [ag__.ld(NMSOnCpuMetaArchitectures).YOLOV5, ag__.ld(NMSOnCpuMetaArchitectures).YOLOX]), if_body_4, else_body_4, get_state_4, set_state_4, ('decoded_bboxes', 'detection_score', 'do_return', 'retval_', 'inputs'), 4)
File "/tmp/__autograph_generated_filen0w1t1lj.py", line 96, in else_body_4
ag__.if_stmt((ag__.ld(self).meta_arch == ag__.ld(NMSOnCpuMetaArchitectures).YOLOV5_SEG), if_body_3, else_body_3, get_state_3, set_state_3, ('decoded_bboxes', 'detection_score', 'do_return', 'retval_'), 4)
File "/tmp/__autograph_generated_filen0w1t1lj.py", line 93, in else_body_3
ag__.if_stmt((ag__.ld(self).meta_arch == ag__.ld(NMSOnCpuMetaArchitectures).YOLOV8), if_body_2, else_body_2, get_state_2, set_state_2, ('decoded_bboxes', 'detection_score'), 2)
File "/tmp/__autograph_generated_filen0w1t1lj.py", line 69, in if_body_2
(decoded_bboxes, detection_score) = ag__.converted_call(ag__.ld(self).yolov8_decoding_call, (ag__.ld(inputs),), None, fscope)
File "/tmp/__autograph_generated_filepm3y5f6c.py", line 67, in tf__yolov8_decoding_call
decoded_bboxes = ag__.converted_call(ag__.ld(tf).expand_dims, (ag__.ld(decoded_bboxes),), dict(axis=2), fscope)
ValueError: Exception encountered when calling layer "yolov8_nms_postprocess" (type HailoPostprocess).
in user code:
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/base_hailo_none_nn_core_layer.py", line 43, in call *
outputs = self.call_core(inputs, training, **kwargs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 113, in call_core *
inputs,
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 148, in bbox_decoding_and_nms_call *
decoded_bboxes, detection_score = self.yolov8_decoding_call(inputs)
File "/local/workspace/hailo_virtualenv/lib/python3.8/site-packages/hailo_model_optimization/acceleras/hailo_layers/hailo_postprocess.py", line 355, in yolov8_decoding_call *
decoded_bboxes = tf.expand_dims(decoded_bboxes, axis=2)
ValueError: Tried to convert 'input' to a tensor and failed. Error: None values not supported.
Call arguments received by layer "yolov8_nms_postprocess" (type HailoPostprocess):
• inputs=['tf.Tensor(shape=(None, 56, 56, 64), dtype=float32)', 'tf.Tensor(shape=(None, 56, 56, 2), dtype=float32)', 'tf.Tensor(shape=(None, 28, 28, 64), dtype=float32)', 'tf.Tensor(shape=(None, 28, 28, 2), dtype=float32)', 'tf.Tensor(shape=(None, 14, 14, 64), dtype=float32)', 'tf.Tensor(shape=(None, 14, 14, 2), dtype=float32)']
• training=False
• kwargs=<class 'inspect._empty'>
I’m running Hailo Model Zoo v2.12.0 and Hailo AI SW Suite 2024-07.1.