I tried again today and it is back to being unstable, even with DeGirum 0.16.1: it hangs once a face is detected by the camera and embeddings are generated. I am not sure whether this is a software or a hardware issue at this point.
If you have any other suggestions, please let me know.
If anyone else has a Hailo-8 + RPi5 with HailoRT 4.20 and could try the script below, please let me know whether it runs OK.
Details are as follows:
Python script output:
(venv_hailo_rpi5_examples) conorroche@picam1:~/hailo-rpi5-examples $ python digirum_rec.py
[0:11:14.441243921] [10027] INFO Camera camera_manager.cpp:326 libcamera v0.5.0+59-d83ff0a4
[0:11:14.448270206] [10056] INFO RPI pisp.cpp:720 libpisp version v1.2.1 981977ff21f3 29-04-2025 (14:13:50)
[0:11:14.457470400] [10056] INFO RPI pisp.cpp:1179 Registered camera /base/axi/pcie@1000120000/rp1/i2c@88000/ov5647@36 to CFE device /dev/media0 and ISP device /dev/media2 using PiSP variant BCM2712_D0
[0:11:14.461936923] [10027] INFO Camera camera.cpp:1205 configuring streams: (0) 640x480-RGB888 (1) 640x480-GBRG_PISP_COMP1
[0:11:14.462070720] [10056] INFO RPI pisp.cpp:1483 Sensor: /base/axi/pcie@1000120000/rp1/i2c@88000/ov5647@36 - Selected sensor format: 640x480-SGBRG10_1X10 - Selected CFE format: 640x480-PC1g
degirum.exceptions.DegirumException: Model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1' inference failed: [CRITICAL]Operation failed
HailoRT Runtime Agent: Async inference did not complete successfully within the timeout, status = HAILO_TIMEOUT.
hailo_runtime_agent.cpp: 585 [DG::HailoRuntimeAgentImpl::Forward]
When running model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
  File "/home/conorroche/hailo-rpi5-examples/digirum_rec.py", line 119, in <module>
    for face, face_embedding in zip(result.results, face_rec_model.predict_batch(aligned_faces)):
  File "/home/conorroche/hailo-rpi5-examples/venv_hailo_rpi5_examples/lib/python3.11/site-packages/degirum/model.py", line 289, in predict_batch
    for res in self._predict_impl(source):
  File "/home/conorroche/hailo-rpi5-examples/venv_hailo_rpi5_examples/lib/python3.11/site-packages/degirum/model.py", line 1206, in _predict_impl
    raise DegirumException(
degirum.exceptions.DegirumException: Failed to perform model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1' inference: Model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1' inference failed: [CRITICAL]Operation failed
HailoRT Runtime Agent: Async inference did not complete successfully within the timeout, status = HAILO_TIMEOUT.
hailo_runtime_agent.cpp: 585 [DG::HailoRuntimeAgentImpl::Forward]
When running model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1'
[HailoRT] [critical] Executing pipeline terminate failed with status HAILO_RPC_FAILED(77)
Hailort.log:
[2025-05-21 17:48:04.463] [10027] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:04.465] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.466] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.536] [10027] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:04.538] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.538] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.579] [10027] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:04.580] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.581] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.912] [10043] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:04.914] [10043] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.915] [10043] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.940] [10043] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: true
[2025-05-21 17:48:05.956] [10043] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:05.958] [10043] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:05.969] [10043] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-21 17:48:05.969] [10043] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-21 17:48:06.002] [10043] [HailoRT] [info] [infer_model.cpp:436] [configure] Configuring network group 'scrfd_2_5g' with params: batch size: 0, power mode: ULTRA_PERFORMANCE, latency: NONE
[2025-05-21 17:48:06.004] [10043] [HailoRT] [info] [multi_io_elements.cpp:756] [create] Created (AsyncHwEl)
[2025-05-21 17:48:06.004] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0scrfd_2_5g/input_layer1 | timeout: 10s)
[2025-05-21 17:48:06.004] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl2AsyncHwEl)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl3AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl3AsyncHwEl | Reorder - src_order: NHCW, src_shape: (20, 20, 2), dst_order: NHWC, dst_shape: (20, 20, 2))
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl3AsyncHwEl)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl8AsyncHwEl)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl5AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl5AsyncHwEl | Reorder - src_order: FCR, src_shape: (80, 80, 24), dst_order: FCR, dst_shape: (80, 80, 20))
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl5AsyncHwEl)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl6AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl6AsyncHwEl | Reorder - src_order: FCR, src_shape: (20, 20, 24), dst_order: FCR, dst_shape: (20, 20, 20))
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl6AsyncHwEl)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl7AsyncHwEl)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl4AsyncHwEl)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl0AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl0AsyncHwEl | Reorder - src_order: FCR, src_shape: (40, 40, 24), dst_order: FCR, dst_shape: (40, 40, 20))
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl0AsyncHwEl)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl1AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl1AsyncHwEl | Reorder - src_order: NHCW, src_shape: (20, 20, 8), dst_order: NHWC, dst_shape: (20, 20, 8))
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl1AsyncHwEl)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0scrfd_2_5g/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 10096)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0scrfd_2_5g/input_layer1[0] | outputs: PushQEl0AsyncHwEl PushQEl1AsyncHwEl LastAsyncEl2AsyncHwEl PushQEl3AsyncHwEl LastAsyncEl4AsyncHwEl PushQEl5AsyncHwEl PushQEl6AsyncHwEl LastAsyncEl7AsyncHwEl LastAsyncEl8AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl0AsyncHwEl(running in thread_id: 10100)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl0AsyncHwEl | inputs: PushQEl0AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl0AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl0AsyncHwEl | inputs: PostInferEl0AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl1AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl1AsyncHwEl(running in thread_id: 10101)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl1AsyncHwEl | inputs: PushQEl1AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl1AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl1AsyncHwEl | inputs: PostInferEl1AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl2AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl3AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl3AsyncHwEl(running in thread_id: 10097)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl3AsyncHwEl | inputs: PushQEl3AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl3AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl3AsyncHwEl | inputs: PostInferEl3AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl4AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl5AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl5AsyncHwEl(running in thread_id: 10098)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl5AsyncHwEl | inputs: PushQEl5AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl5AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl5AsyncHwEl | inputs: PostInferEl5AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl6AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl6AsyncHwEl(running in thread_id: 10099)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl6AsyncHwEl | inputs: PushQEl6AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl6AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl6AsyncHwEl | inputs: PostInferEl6AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl7AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl8AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-21 17:48:09.738] [10042] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:09.739] [10042] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:09.740] [10042] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:09.760] [10042] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: true
[2025-05-21 17:48:10.762] [10042] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:10.763] [10042] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:10.816] [10042] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-21 17:48:10.816] [10042] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-21 17:48:10.841] [10042] [HailoRT] [info] [infer_model.cpp:436] [configure] Configuring network group 'arcface_mobilefacenet' with params: batch size: 0, power mode: ULTRA_PERFORMANCE, latency: NONE
[2025-05-21 17:48:10.842] [10042] [HailoRT] [info] [multi_io_elements.cpp:756] [create] Created (AsyncHwEl)
[2025-05-21 17:48:10.842] [10042] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0arcface_mobilefacenet/input_layer1 | timeout: 10s)
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0AsyncHwEl)
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0arcface_mobilefacenet/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 10350)
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0arcface_mobilefacenet/input_layer1[0] | outputs: LastAsyncEl0AsyncHwEl
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-21 17:48:34.760] [10042] [HailoRT] [error] [infer_model.cpp:1048] [wait] CHECK failed - Waiting for async job to finish has failed with timeout (5000ms)
[2025-05-21 17:48:34.760] [10042] [HailoRT] [error] [infer_model.cpp:1021] [wait] CHECK_SUCCESS failed with status=HAILO_TIMEOUT(4)
[2025-05-21 17:48:34.762] [10043] [HailoRT] [error] [infer_model.cpp:1048] [wait] CHECK failed - Waiting for async job to finish has failed with timeout (5000ms)
[2025-05-21 17:48:34.762] [10043] [HailoRT] [error] [infer_model.cpp:1021] [wait] CHECK_SUCCESS failed with status=HAILO_TIMEOUT(4)
[2025-05-21 17:48:35.263] [10027] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-05-21 17:48:38.529] [10348] [HailoRT] [error] [network_group_client.cpp:664] [operator()] Infer request callback failed with status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-05-21 17:48:38.529] [10348] [HailoRT] [info] [vdevice.cpp:424] [listener_run_in_thread] Shutdown event was signaled in listener_run_in_thread
[2025-05-21 17:48:38.529] [10094] [HailoRT] [error] [network_group_client.cpp:664] [operator()] Infer request callback failed with status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-05-21 17:48:38.529] [10094] [HailoRT] [info] [vdevice.cpp:424] [listener_run_in_thread] Shutdown event was signaled in listener_run_in_thread
[2025-05-21 17:48:40.263] [10350] [HailoRT] [error] [hailort_rpc_client.cpp:1552] [ConfiguredNetworkGroup_infer_async] CHECK_GRPC_STATUS failed with error code: 4.
[2025-05-21 17:48:40.263] [10350] [HailoRT] [warning] [hailort_rpc_client.cpp:1552] [ConfiguredNetworkGroup_infer_async] Make sure HailoRT service is enabled and active!
[2025-05-21 17:48:40.263] [10350] [HailoRT] [error] [network_group_client.cpp:697] [infer_async] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-21 17:48:40.263] [10350] [HailoRT] [error] [pipeline_internal.cpp:26] [handle_non_recoverable_async_error] Non-recoverable Async Infer Pipeline error. status error code: HAILO_RPC_FAILED(77)
[2025-05-21 17:48:40.263] [10350] [HailoRT] [error] [async_infer_runner.cpp:88] [shutdown] Shutting down the pipeline with status HAILO_RPC_FAILED(77)
[2025-05-21 17:48:45.764] [10027] [HailoRT] [error] [hailort_rpc_client.cpp:647] [ConfiguredNetworkGroup_shutdown] CHECK_GRPC_STATUS failed with error code: 4.
[2025-05-21 17:48:45.770] [10027] [HailoRT] [warning] [hailort_rpc_client.cpp:647] [ConfiguredNetworkGroup_shutdown] Make sure HailoRT service is enabled and active!
[2025-05-21 17:48:45.770] [10027] [HailoRT] [error] [network_group_client.cpp:258] [shutdown] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77) - Failed to shutdown
[2025-05-21 17:48:45.770] [10027] [HailoRT] [error] [multi_io_elements.cpp:1032] [execute_terminate] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-21 17:48:45.770] [10027] [HailoRT] [error] [pipeline.cpp:1034] [execute] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-21 17:48:45.770] [10027] [HailoRT] [error] [queue_elements.cpp:599] [execute_terminate] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-21 17:48:45.770] [10027] [HailoRT] [critical] [async_infer_runner.cpp:99] [shutdown] Executing pipeline terminate failed with status HAILO_RPC_FAILED(77)
System details:
(venv_hailo_rpi5_examples) conorroche@picam1:~/hailo-rpi5-examples $ sudo systemctl status hailort.service
● hailort.service - HailoRT service
     Loaded: loaded (/lib/systemd/system/hailort.service; enabled; preset: enabled)
     Active: active (running) since Wed 2025-05-21 17:36:53 IST; 2min 16s ago
       Docs: https://github.com/hailo-ai/hailort
    Process: 635 ExecStart=/usr/local/bin/hailort_service (code=exited, status=0/SUCCESS)
    Process: 775 ExecStartPost=/bin/sleep 0.1 (code=exited, status=0/SUCCESS)
   Main PID: 774 (hailort_service)
      Tasks: 11 (limit: 9573)
        CPU: 15ms
     CGroup: /system.slice/hailort.service
             └─774 /usr/local/bin/hailort_service
May 21 17:36:52 picam1 systemd[1]: Starting hailort.service - HailoRT service...
May 21 17:36:53 picam1 systemd[1]: Started hailort.service - HailoRT service.
(venv_hailo_rpi5_examples) conorroche@picam1:~/hailo-rpi5-examples $ hailortcli fw-control identify --extended
Executing on device: 0001:01:00.0
Identifying board
Control Protocol Version: 2
Firmware Version: 4.20.0 (release,app,extended context switch buffer)
Logger Version: 0
Board Name: Hailo-8
Device Architecture: HAILO8
Serial Number: <N/A>
Part Number: <N/A>
Product Name: <N/A>
Boot source: PCIE
Neural Network Core Clock Rate: 400MHz
Device supported features: PCIE
LCS: 3
SoC ID: DABED32B6B5B560CD89CF0A7043205C642004833161516C5E403C812FA82CE5C
ULT ID: 0060C1BB88EC5850FBFAE24C
PM Values: 024601000002A201000002240200008FD13342F872364201
(venv_hailo_rpi5_examples) conorroche@picam1:~/hailo-rpi5-examples $ degirum sys-info
Devices:
  HAILORT/HAILO8:
  - '@Index': 0
    Board Name: Hailo-8
    Device Architecture: HAILO8
    Firmware Version: 4.20.0
    ID: '0001:01:00.0'
    Part Number: ''
    Product Name: ''
    Serial Number: ''
  N2X/CPU:
  - '@Index': 0
  TFLITE/CPU:
  - '@Index': 0
  - '@Index': 1
Software Version: 0.16.1
Python script:
import degirum as dg
import numpy as np
import cv2
import time
import logging
from picamera2 import Picamera2

# Define a frame generator: a function that yields frames from the Picamera2
def frame_generator():
    picam2 = Picamera2()
    # Configure the camera (optional: set the resolution or other settings)
    picam2.configure(picam2.create_preview_configuration({'format': 'RGB888'}))
    # Start the camera
    picam2.start()
    try:
        while True:
            # Capture a frame as a numpy array
            frame = picam2.capture_array()
            # Yield the frame
            yield frame
    finally:
        picam2.stop()  # Stop the camera when the generator is closed
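If it helps anyone reproduce this without a camera attached (to rule out the capture path), the generator above can be swapped for one that yields a fixed synthetic frame. A numpy-only sketch; the function name and frame contents are mine, not part of the original example:

```python
import numpy as np

def static_frame_generator(n_frames=10, width=640, height=480):
    """Yield the same synthetic RGB888 frame repeatedly, mimicking the
    shape/dtype of frame_generator() without touching Picamera2."""
    frame = np.zeros((height, width, 3), dtype=np.uint8)
    frame[:] = (32, 64, 96)  # arbitrary solid color
    for _ in range(n_frames):
        yield frame

# Example: collect a few frames as predict_batch would consume them
frames = list(static_frame_generator(3))
```

Passing this generator to `face_det_model.predict_batch(...)` instead of `frame_generator()` keeps the Hailo inference path identical while removing libcamera from the equation.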
def align_and_crop(img, landmarks, image_size=112):
    """
    Align and crop the face from the image based on the given landmarks.

    Args:
        img (np.ndarray): The full image (not the cropped bounding box). This image will be transformed.
        landmarks (List[np.ndarray]): List of 5 keypoints (landmarks) as (x, y) coordinates. These keypoints typically include the eyes, nose, and mouth.
        image_size (int, optional): The size to which the image should be resized. Defaults to 112. It is typically either 112 or 128 for face recognition models.

    Returns:
        Tuple[np.ndarray, np.ndarray]: The aligned face image and the transformation matrix.
    """
    # Define the reference keypoints used in the ArcFace model, based on a typical facial landmark set.
    _arcface_ref_kps = np.array(
        [
            [38.2946, 51.6963],  # Left eye
            [73.5318, 51.5014],  # Right eye
            [56.0252, 71.7366],  # Nose
            [41.5493, 92.3655],  # Left mouth corner
            [70.7299, 92.2041],  # Right mouth corner
        ],
        dtype=np.float32,
    )
    # Ensure the input landmarks have exactly 5 points (as expected for face alignment)
    assert len(landmarks) == 5
    # Validate that image_size is divisible by either 112 or 128 (common image sizes for face recognition models)
    assert image_size % 112 == 0 or image_size % 128 == 0
    # Adjust the scaling factor (ratio) based on the desired image size (112 or 128)
    if image_size % 112 == 0:
        ratio = float(image_size) / 112.0
        diff_x = 0  # No horizontal shift for 112 scaling
    else:
        ratio = float(image_size) / 128.0
        diff_x = 8.0 * ratio  # Horizontal shift for 128 scaling
    # Apply the scaling and shifting to the reference keypoints
    dst = _arcface_ref_kps * ratio
    dst[:, 0] += diff_x  # Apply the horizontal shift
    # Estimate the similarity transformation matrix to align the landmarks with the reference keypoints
    M, inliers = cv2.estimateAffinePartial2D(np.array(landmarks), dst, ransacReprojThreshold=1000)
    assert np.all(inliers == True)
    # Apply the affine transformation to the input image to align the face
    aligned_img = cv2.warpAffine(img, M, (image_size, image_size), borderValue=0.0)
    return aligned_img, M
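As an aside, the similarity transform that `cv2.estimateAffinePartial2D` computes here can be sanity-checked without OpenCV: for exact correspondences it reduces to a 4-parameter least-squares solve. A numpy-only sketch (the helper name is mine); when the landmarks already coincide with the reference keypoints it should return the identity transform:

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares 2x3 similarity transform [a, -b, tx; b, a, ty]
    mapping src points to dst points. A rough numpy stand-in for what
    cv2.estimateAffinePartial2D computes (without RANSAC)."""
    src = np.asarray(src, dtype=np.float64)
    dst = np.asarray(dst, dtype=np.float64)
    n = src.shape[0]
    A = np.zeros((2 * n, 4))
    rhs = dst.reshape(-1)  # interleaved [x'0, y'0, x'1, y'1, ...]
    # x' = a*x - b*y + tx
    A[0::2, 0] = src[:, 0]; A[0::2, 1] = -src[:, 1]; A[0::2, 2] = 1
    # y' = b*x + a*y + ty
    A[1::2, 0] = src[:, 1]; A[1::2, 1] = src[:, 0]; A[1::2, 3] = 1
    a, b, tx, ty = np.linalg.lstsq(A, rhs, rcond=None)[0]
    return np.array([[a, -b, tx], [b, a, ty]])

# Identity case: the ArcFace reference keypoints mapped onto themselves
ref = np.array([[38.2946, 51.6963], [73.5318, 51.5014], [56.0252, 71.7366],
                [41.5493, 92.3655], [70.7299, 92.2041]])
M = similarity_transform(ref, ref)
```

This is only for checking the alignment math offline; the script itself should keep using `cv2.estimateAffinePartial2D`, which also handles outliers via RANSAC.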
# Specify the model names
face_det_model_name = "scrfd_2.5g--640x640_quant_hailort_hailo8_1"
face_rec_model_name = "arcface_mobilefacenet--112x112_quant_hailort_hailo8_1"

# Specify the inference host address
# inference_host_address = "@cloud"  # Use "@cloud" for cloud inference
inference_host_address = "@local"  # Use "@local" for local inference

# Specify the zoo_url
# zoo_url = "degirum/models_hailort"
zoo_url = "/home/conorroche/models"  # For local model files

token = ''  # Leave empty for local inference

# Load the face detection model
face_det_model = dg.load_model(
    model_name=face_det_model_name,
    inference_host_address=inference_host_address,
    zoo_url=zoo_url,
    token=token,
    overlay_color=(0, 255, 0)  # Green color for bounding boxes
)

# Load the face recognition model
face_rec_model = dg.load_model(
    model_name=face_rec_model_name,
    inference_host_address=inference_host_address,
    zoo_url=zoo_url,
    token=token,
    overlay_color=(0, 255, 0)  # Green color for bounding boxes
)
for result in face_det_model.predict_batch(frame_generator()):
    aligned_faces = []
    if result.results:
        for face in result.results:
            landmarks = [landmark["landmark"] for landmark in face["landmarks"]]
            aligned_face, _ = align_and_crop(result.image, landmarks)  # Align and crop face
            aligned_faces.append(aligned_face)
        for face, face_embedding in zip(result.results, face_rec_model.predict_batch(aligned_faces)):
            embedding = face_embedding.results[0]["data"][0]  # Extract embedding
    cv2.imshow("AI Inference", result.image_overlay)
    # Process GUI events and break the loop if 'q' key was pressed
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

# Destroy any remaining OpenCV windows after the loop finishes
cv2.destroyAllWindows()
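For context, the script extracts `embedding` but does not yet compare it to anything; the hang happens before that stage. Once embeddings do come back, the usual way to match ArcFace embeddings is cosine similarity. A numpy-only sketch (the helper name is mine):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors: 1.0 means
    identical direction, 0.0 means orthogonal (unrelated)."""
    a = np.asarray(a, dtype=np.float64)
    b = np.asarray(b, dtype=np.float64)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # identical vectors -> 1.0
```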