A Comprehensive Guide to Building a Face Recognition System

@shashi

I tried again today and it is unstable again, even with DeGirum 0.16.1: it hangs once a face is detected by the camera and embeddings are generated. I'm not sure whether this is a software or a hardware issue at this point.

If you have any other suggestions please let me know.

If anyone else has a Hailo-8 + RPi5 with HailoRT 4.20 and could try the script below, please let me know if it runs OK.

Details are as follows:

Python script output:

(venv_hailo_rpi5_examples) conorroche@picam1:~/hailo-rpi5-examples $ python digirum_rec.py 
[0:11:14.441243921] [10027]  INFO Camera camera_manager.cpp:326 libcamera v0.5.0+59-d83ff0a4
[0:11:14.448270206] [10056]  INFO RPI pisp.cpp:720 libpisp version v1.2.1 981977ff21f3 29-04-2025 (14:13:50)
[0:11:14.457470400] [10056]  INFO RPI pisp.cpp:1179 Registered camera /base/axi/pcie@1000120000/rp1/i2c@88000/ov5647@36 to CFE device /dev/media0 and ISP device /dev/media2 using PiSP variant BCM2712_D0
[0:11:14.461936923] [10027]  INFO Camera camera.cpp:1205 configuring streams: (0) 640x480-RGB888 (1) 640x480-GBRG_PISP_COMP1
[0:11:14.462070720] [10056]  INFO RPI pisp.cpp:1483 Sensor: /base/axi/pcie@1000120000/rp1/i2c@88000/ov5647@36 - Selected sensor format: 640x480-SGBRG10_1X10 - Selected CFE format: 640x480-PC1g
degirum.exceptions.DegirumException: Model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1' inference failed: [CRITICAL]Operation failed
HailoRT Runtime Agent: Async inference did not complete successfully within the timeout, status = HAILO_TIMEOUT.
hailo_runtime_agent.cpp: 585 [DG::HailoRuntimeAgentImpl::Forward]
When running model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/conorroche/hailo-rpi5-examples/digirum_rec.py", line 119, in <module>
    for face, face_embedding in zip(result.results, face_rec_model.predict_batch(aligned_faces)):
  File "/home/conorroche/hailo-rpi5-examples/venv_hailo_rpi5_examples/lib/python3.11/site-packages/degirum/model.py", line 289, in predict_batch
    for res in self._predict_impl(source):
  File "/home/conorroche/hailo-rpi5-examples/venv_hailo_rpi5_examples/lib/python3.11/site-packages/degirum/model.py", line 1206, in _predict_impl
    raise DegirumException(
degirum.exceptions.DegirumException: Failed to perform model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1' inference: Model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1' inference failed: [CRITICAL]Operation failed
HailoRT Runtime Agent: Async inference did not complete successfully within the timeout, status = HAILO_TIMEOUT.
hailo_runtime_agent.cpp: 585 [DG::HailoRuntimeAgentImpl::Forward]
When running model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1'
[HailoRT] [critical] Executing pipeline terminate failed with status HAILO_RPC_FAILED(77)

Hailort.log:

[2025-05-21 17:48:04.463] [10027] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:04.465] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.466] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.536] [10027] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:04.538] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.538] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.579] [10027] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:04.580] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.581] [10027] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.912] [10043] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:04.914] [10043] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.915] [10043] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:04.940] [10043] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: true
[2025-05-21 17:48:05.956] [10043] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:05.958] [10043] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:05.969] [10043] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-21 17:48:05.969] [10043] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-21 17:48:06.002] [10043] [HailoRT] [info] [infer_model.cpp:436] [configure] Configuring network group 'scrfd_2_5g' with params: batch size: 0, power mode: ULTRA_PERFORMANCE, latency: NONE
[2025-05-21 17:48:06.004] [10043] [HailoRT] [info] [multi_io_elements.cpp:756] [create] Created (AsyncHwEl)
[2025-05-21 17:48:06.004] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0scrfd_2_5g/input_layer1 | timeout: 10s)
[2025-05-21 17:48:06.004] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl2AsyncHwEl)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl3AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl3AsyncHwEl | Reorder - src_order: NHCW, src_shape: (20, 20, 2), dst_order: NHWC, dst_shape: (20, 20, 2))
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl3AsyncHwEl)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl8AsyncHwEl)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl5AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl5AsyncHwEl | Reorder - src_order: FCR, src_shape: (80, 80, 24), dst_order: FCR, dst_shape: (80, 80, 20))
[2025-05-21 17:48:06.005] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl5AsyncHwEl)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl6AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl6AsyncHwEl | Reorder - src_order: FCR, src_shape: (20, 20, 24), dst_order: FCR, dst_shape: (20, 20, 20))
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl6AsyncHwEl)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl7AsyncHwEl)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl4AsyncHwEl)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl0AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl0AsyncHwEl | Reorder - src_order: FCR, src_shape: (40, 40, 24), dst_order: FCR, dst_shape: (40, 40, 20))
[2025-05-21 17:48:06.006] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl0AsyncHwEl)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl1AsyncHwEl | timeout: 10s)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl1AsyncHwEl | Reorder - src_order: NHCW, src_shape: (20, 20, 8), dst_order: NHWC, dst_shape: (20, 20, 8))
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl1AsyncHwEl)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0scrfd_2_5g/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 10096)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0scrfd_2_5g/input_layer1[0] | outputs: PushQEl0AsyncHwEl PushQEl1AsyncHwEl LastAsyncEl2AsyncHwEl PushQEl3AsyncHwEl LastAsyncEl4AsyncHwEl PushQEl5AsyncHwEl PushQEl6AsyncHwEl LastAsyncEl7AsyncHwEl LastAsyncEl8AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl0AsyncHwEl(running in thread_id: 10100)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl0AsyncHwEl | inputs: PushQEl0AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl0AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl0AsyncHwEl | inputs: PostInferEl0AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl1AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl1AsyncHwEl(running in thread_id: 10101)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl1AsyncHwEl | inputs: PushQEl1AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl1AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl1AsyncHwEl | inputs: PostInferEl1AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl2AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl3AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl3AsyncHwEl(running in thread_id: 10097)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl3AsyncHwEl | inputs: PushQEl3AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl3AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl3AsyncHwEl | inputs: PostInferEl3AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl4AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl5AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl5AsyncHwEl(running in thread_id: 10098)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl5AsyncHwEl | inputs: PushQEl5AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl5AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl5AsyncHwEl | inputs: PostInferEl5AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl6AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl6AsyncHwEl(running in thread_id: 10099)
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl6AsyncHwEl | inputs: PushQEl6AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl6AsyncHwEl
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl6AsyncHwEl | inputs: PostInferEl6AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl7AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl8AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-21 17:48:06.007] [10043] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-21 17:48:09.738] [10042] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:09.739] [10042] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:09.740] [10042] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:09.760] [10042] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: true
[2025-05-21 17:48:10.762] [10042] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-21 17:48:10.763] [10042] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-21 17:48:10.816] [10042] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-21 17:48:10.816] [10042] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-21 17:48:10.841] [10042] [HailoRT] [info] [infer_model.cpp:436] [configure] Configuring network group 'arcface_mobilefacenet' with params: batch size: 0, power mode: ULTRA_PERFORMANCE, latency: NONE
[2025-05-21 17:48:10.842] [10042] [HailoRT] [info] [multi_io_elements.cpp:756] [create] Created (AsyncHwEl)
[2025-05-21 17:48:10.842] [10042] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0arcface_mobilefacenet/input_layer1 | timeout: 10s)
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0AsyncHwEl)
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0arcface_mobilefacenet/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 10350)
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0arcface_mobilefacenet/input_layer1[0] | outputs: LastAsyncEl0AsyncHwEl
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-21 17:48:10.843] [10042] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-21 17:48:34.760] [10042] [HailoRT] [error] [infer_model.cpp:1048] [wait] CHECK failed - Waiting for async job to finish has failed with timeout (5000ms)
[2025-05-21 17:48:34.760] [10042] [HailoRT] [error] [infer_model.cpp:1021] [wait] CHECK_SUCCESS failed with status=HAILO_TIMEOUT(4)
[2025-05-21 17:48:34.762] [10043] [HailoRT] [error] [infer_model.cpp:1048] [wait] CHECK failed - Waiting for async job to finish has failed with timeout (5000ms)
[2025-05-21 17:48:34.762] [10043] [HailoRT] [error] [infer_model.cpp:1021] [wait] CHECK_SUCCESS failed with status=HAILO_TIMEOUT(4)
[2025-05-21 17:48:35.263] [10027] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-05-21 17:48:38.529] [10348] [HailoRT] [error] [network_group_client.cpp:664] [operator()] Infer request callback failed with status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-05-21 17:48:38.529] [10348] [HailoRT] [info] [vdevice.cpp:424] [listener_run_in_thread] Shutdown event was signaled in listener_run_in_thread
[2025-05-21 17:48:38.529] [10094] [HailoRT] [error] [network_group_client.cpp:664] [operator()] Infer request callback failed with status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-05-21 17:48:38.529] [10094] [HailoRT] [info] [vdevice.cpp:424] [listener_run_in_thread] Shutdown event was signaled in listener_run_in_thread
[2025-05-21 17:48:40.263] [10350] [HailoRT] [error] [hailort_rpc_client.cpp:1552] [ConfiguredNetworkGroup_infer_async] CHECK_GRPC_STATUS failed with error code: 4.
[2025-05-21 17:48:40.263] [10350] [HailoRT] [warning] [hailort_rpc_client.cpp:1552] [ConfiguredNetworkGroup_infer_async] Make sure HailoRT service is enabled and active!
[2025-05-21 17:48:40.263] [10350] [HailoRT] [error] [network_group_client.cpp:697] [infer_async] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-21 17:48:40.263] [10350] [HailoRT] [error] [pipeline_internal.cpp:26] [handle_non_recoverable_async_error] Non-recoverable Async Infer Pipeline error. status error code: HAILO_RPC_FAILED(77)
[2025-05-21 17:48:40.263] [10350] [HailoRT] [error] [async_infer_runner.cpp:88] [shutdown] Shutting down the pipeline with status HAILO_RPC_FAILED(77)
[2025-05-21 17:48:45.764] [10027] [HailoRT] [error] [hailort_rpc_client.cpp:647] [ConfiguredNetworkGroup_shutdown] CHECK_GRPC_STATUS failed with error code: 4.
[2025-05-21 17:48:45.770] [10027] [HailoRT] [warning] [hailort_rpc_client.cpp:647] [ConfiguredNetworkGroup_shutdown] Make sure HailoRT service is enabled and active!
[2025-05-21 17:48:45.770] [10027] [HailoRT] [error] [network_group_client.cpp:258] [shutdown] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77) - Failed to shutdown
[2025-05-21 17:48:45.770] [10027] [HailoRT] [error] [multi_io_elements.cpp:1032] [execute_terminate] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-21 17:48:45.770] [10027] [HailoRT] [error] [pipeline.cpp:1034] [execute] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-21 17:48:45.770] [10027] [HailoRT] [error] [queue_elements.cpp:599] [execute_terminate] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-21 17:48:45.770] [10027] [HailoRT] [critical] [async_infer_runner.cpp:99] [shutdown] Executing pipeline terminate failed with status HAILO_RPC_FAILED(77)

System details:

(venv_hailo_rpi5_examples) conorroche@picam1:~/hailo-rpi5-examples $ sudo systemctl status hailort.service
● hailort.service - HailoRT service
     Loaded: loaded (/lib/systemd/system/hailort.service; enabled; preset: enabled)
     Active: active (running) since Wed 2025-05-21 17:36:53 IST; 2min 16s ago
       Docs: https://github.com/hailo-ai/hailort
    Process: 635 ExecStart=/usr/local/bin/hailort_service (code=exited, status=0/SUCCESS)
    Process: 775 ExecStartPost=/bin/sleep 0.1 (code=exited, status=0/SUCCESS)
   Main PID: 774 (hailort_service)
      Tasks: 11 (limit: 9573)
        CPU: 15ms
     CGroup: /system.slice/hailort.service
             └─774 /usr/local/bin/hailort_service

May 21 17:36:52 picam1 systemd[1]: Starting hailort.service - HailoRT service...
May 21 17:36:53 picam1 systemd[1]: Started hailort.service - HailoRT service.
(venv_hailo_rpi5_examples) conorroche@picam1:~/hailo-rpi5-examples $ hailortcli fw-control identify --extended
Executing on device: 0001:01:00.0
Identifying board
Control Protocol Version: 2
Firmware Version: 4.20.0 (release,app,extended context switch buffer)
Logger Version: 0
Board Name: Hailo-8
Device Architecture: HAILO8
Serial Number: <N/A>
Part Number: <N/A>
Product Name: <N/A>
Boot source: PCIE
Neural Network Core Clock Rate: 400MHz
Device supported features: PCIE
LCS: 3
SoC ID: DABED32B6B5B560CD89CF0A7043205C642004833161516C5E403C812FA82CE5C
ULT ID: 0060C1BB88EC5850FBFAE24C
PM Values: 024601000002A201000002240200008FD13342F872364201

(venv_hailo_rpi5_examples) conorroche@picam1:~/hailo-rpi5-examples $ degirum sys-info
Devices:
  HAILORT/HAILO8:
  - '@Index': 0
    Board Name: Hailo-8
    Device Architecture: HAILO8
    Firmware Version: 4.20.0
    ID: '0001:01:00.0'
    Part Number: ''
    Product Name: ''
    Serial Number: ''
  N2X/CPU:
  - '@Index': 0
  TFLITE/CPU:
  - '@Index': 0
  - '@Index': 1
Software Version: 0.16.1

Python script:

import degirum as dg
import numpy as np
import cv2
import time
import logging
from picamera2 import Picamera2

# Define a frame generator: a function that yields frames from the Picamera2
def frame_generator():
    picam2 = Picamera2()

    # Configure the camera (optional: set the resolution or other settings)
    picam2.configure(picam2.create_preview_configuration({'format': 'RGB888'}))

    # Start the camera
    picam2.start()

    try:
        while True:
            # Capture a frame as a numpy array
            frame = picam2.capture_array()


            # Yield the frame
            yield frame
    finally:
        picam2.stop()  # Stop the camera when the generator is closed

def align_and_crop(img, landmarks, image_size=112):
    """
    Align and crop the face from the image based on the given landmarks.

    Args:
        img (np.ndarray): The full image (not the cropped bounding box). This image will be transformed.
        landmarks (List[np.ndarray]): List of 5 keypoints (landmarks) as (x, y) coordinates. These keypoints typically include the eyes, nose, and mouth.
        image_size (int, optional): The size to which the image should be resized. Defaults to 112. It is typically either 112 or 128 for face recognition models.

    Returns:
        Tuple[np.ndarray, np.ndarray]: The aligned face image and the transformation matrix.
    """
    # Define the reference keypoints used in ArcFace model, based on a typical facial landmark set.
    _arcface_ref_kps = np.array(
        [
            [38.2946, 51.6963],  # Left eye
            [73.5318, 51.5014],  # Right eye
            [56.0252, 71.7366],  # Nose
            [41.5493, 92.3655],  # Left mouth corner
            [70.7299, 92.2041],  # Right mouth corner
        ],
        dtype=np.float32,
    )

    # Ensure the input landmarks have exactly 5 points (as expected for face alignment)
    assert len(landmarks) == 5

    # Validate that image_size is divisible by either 112 or 128 (common image sizes for face recognition models)
    assert image_size % 112 == 0 or image_size % 128 == 0

    # Adjust the scaling factor (ratio) based on the desired image size (112 or 128)
    if image_size % 112 == 0:
        ratio = float(image_size) / 112.0
        diff_x = 0  # No horizontal shift for 112 scaling
    else:
        ratio = float(image_size) / 128.0
        diff_x = 8.0 * ratio  # Horizontal shift for 128 scaling

    # Apply the scaling and shifting to the reference keypoints
    dst = _arcface_ref_kps * ratio
    dst[:, 0] += diff_x  # Apply the horizontal shift

    # Estimate the similarity transformation matrix to align the landmarks with the reference keypoints
    M, inliers = cv2.estimateAffinePartial2D(np.array(landmarks), dst, ransacReprojThreshold=1000)
    assert np.all(inliers == True)

    # Apply the affine transformation to the input image to align the face
    aligned_img = cv2.warpAffine(img, M, (image_size, image_size), borderValue=0.0)

    return aligned_img, M


# Specify the model names
face_det_model_name = "scrfd_2.5g--640x640_quant_hailort_hailo8_1"
face_rec_model_name = "arcface_mobilefacenet--112x112_quant_hailort_hailo8_1"

# Specify the inference host address
#inference_host_address = "@cloud"  # Use "@cloud" for cloud inference
inference_host_address = "@local"  # Use "@local" for local inference

# Specify the zoo_url
#zoo_url = "degirum/models_hailort"
zoo_url = "/home/conorroche/models"  # For local model files

token = ''  # Leave empty for local inference

# Load the face detection model
face_det_model = dg.load_model(
    model_name=face_det_model_name,
    inference_host_address=inference_host_address,
    zoo_url=zoo_url,
    token=token, 
    overlay_color=(0, 255, 0)  # Green color for bounding boxes
)

face_rec_model = dg.load_model(
    model_name=face_rec_model_name,
    inference_host_address=inference_host_address,
    zoo_url=zoo_url,
    token=token,
    overlay_color=(0, 255, 0)  # Green color for bounding boxes
)

for result in face_det_model.predict_batch(frame_generator()):
    aligned_faces = []
    if result.results:
        for face in result.results:
            landmarks = [landmark["landmark"] for landmark in face["landmarks"]]
            aligned_face, _ = align_and_crop(result.image, landmarks)  # Align and crop face
            aligned_faces.append(aligned_face)
        for face, face_embedding in zip(result.results, face_rec_model.predict_batch(aligned_faces)):
            embedding = face_embedding.results[0]["data"][0]  # Extract embedding
    
    cv2.imshow("AI Inference", result.image_overlay)

    # Process GUI events and break the loop if the 'q' key was pressed
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

# Destroy any remaining OpenCV windows after the loop finishes
cv2.destroyAllWindows()
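For context on what the extracted `embedding` is for: once embedding vectors come back from the recognition model, faces are typically matched against a gallery of enrolled embeddings by cosine similarity. A minimal, self-contained NumPy sketch (the 512-d ArcFace vectors are replaced by toy 3-d vectors, and the 0.4 threshold is an illustrative assumption, not a value from this thread):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match(embedding, gallery, threshold=0.4):
    """Return (name, score) of the best gallery match, or (None, score) if below threshold."""
    best_name, best_score = None, -1.0
    for name, ref in gallery.items():
        score = cosine_similarity(embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)

# Toy gallery of "enrolled" identities (hypothetical names)
gallery = {
    "alice": np.array([1.0, 0.0, 0.0]),
    "bob": np.array([0.0, 1.0, 0.0]),
}

# A probe embedding close to "alice"
print(match(np.array([0.9, 0.1, 0.0]), gallery))
```

In the real loop, each `embedding` from `face_rec_model` would be passed to `match` in place of the toy probe vector.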

Hi @Conor_Roche,

Analyzing your logs:
According to Hailort.log, the first error happens at 17:48:34.
The Hailo service started at 17:36:53, so the error did not cause a service crash/restart.

Questions/suggestions:

  1. After such an error, does a simple object detection script (which used to work in your case) still work, or does it also fail (i.e. is the error recoverable or not)?
  2. PySDK can run local inferences even without the Hailo service. Can you try this: reboot the system (just in case), disable the service (sudo systemctl disable hailort.service), and run the face detection script. If it runs OK, then the problem is with Hailo service RPC communications; if it also fails, the problem is with the Hailo runtime itself.
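The isolation test in suggestion 2 can be sketched as the following shell session (standard systemd commands; the script name is the one from the traceback above):

```shell
# Disable and stop the HailoRT multi-process service, then reboot
sudo systemctl disable hailort.service
sudo systemctl stop hailort.service
sudo reboot

# After reboot, confirm the service is not running (expect "inactive")
systemctl is-active hailort.service

# Run the face detection script; without the service, PySDK talks
# to the Hailo runtime directly rather than over RPC
python digirum_rec.py

# Restore the original setup afterwards
sudo systemctl enable --now hailort.service
```

If the script runs cleanly in this configuration, the RPC path through hailort_service is the prime suspect; if it still times out, the issue is below the service layer.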

We will try to reproduce your problem on our side as well.

Hi @Vlad_Klimov

Thank you for replying. It is the Python script that hangs, and the cv2 window freezes. To recover I have to either reboot or stop the hailort service, which releases something and allows the Python process to be stopped. ps hangs too, and running strace on ps shows:

openat(AT_FDCWD, "/proc/2723/ctty", O_RDONLY) = -1 ENOENT (No such file or directory)
newfstatat(AT_FDCWD, "/proc/2750", {st_mode=S_IFDIR|0555, st_size=0, ...}, 0) = 0
openat(AT_FDCWD, "/proc/2750/stat", O_RDONLY) = 4
read(4, "2750 (hailort_service) S 1 2750 "..., 2048) = 187
close(4)                                = 0
openat(AT_FDCWD, "/proc/2750/status", O_RDONLY) = 4
read(4, "Name:\thailort_service\nUmask:\t002"..., 2048) = 1104
close(4)                                = 0
openat(AT_FDCWD, "/proc/2750/environ", O_RDONLY) = -1 EACCES (Permission denied)
openat(AT_FDCWD, "/proc/2750/cmdline", O_RDONLY) = 4
read(4, 

Another example of the logs for this:

[0:02:25.905369115] [2075]  INFO Camera camera_manager.cpp:326 libcamera v0.5.0+59-d83ff0a4
[0:02:25.912431293] [2104]  INFO RPI pisp.cpp:720 libpisp version v1.2.1 981977ff21f3 29-04-2025 (14:13:50)
[0:02:25.921862184] [2104]  INFO RPI pisp.cpp:1179 Registered camera /base/axi/pcie@1000120000/rp1/i2c@88000/ov5647@36 to CFE device /dev/media2 and ISP device /dev/media0 using PiSP variant BCM2712_D0
[0:02:25.926601609] [2075]  INFO Camera camera.cpp:1205 configuring streams: (0) 640x480-RGB888 (1) 640x480-GBRG_PISP_COMP1
[0:02:25.926735043] [2104]  INFO RPI pisp.cpp:1483 Sensor: /base/axi/pcie@1000120000/rp1/i2c@88000/ov5647@36 - Selected sensor format: 640x480-SGBRG10_1X10 - Selected CFE format: 640x480-PC1g
degirum.exceptions.DegirumException: Model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1' inference failed: [CRITICAL]Operation failed
HailoRT Runtime Agent: Async inference did not complete successfully within the timeout, status = HAILO_TIMEOUT.
hailo_runtime_agent.cpp: 585 [DG::HailoRuntimeAgentImpl::Forward]
When running model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/conorroche/hailo-rpi5-examples/digirum_rec.py", line 119, in <module>
    for face, face_embedding in zip(result.results, face_rec_model.predict_batch(aligned_faces)):
  File "/home/conorroche/hailo-rpi5-examples/venv_hailo_rpi5_examples/lib/python3.11/site-packages/degirum/model.py", line 289, in predict_batch
    for res in self._predict_impl(source):
  File "/home/conorroche/hailo-rpi5-examples/venv_hailo_rpi5_examples/lib/python3.11/site-packages/degirum/model.py", line 1206, in _predict_impl
    raise DegirumException(
degirum.exceptions.DegirumException: Failed to perform model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1' inference: Model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1' inference failed: [CRITICAL]Operation failed
HailoRT Runtime Agent: Async inference did not complete successfully within the timeout, status = HAILO_TIMEOUT.
hailo_runtime_agent.cpp: 585 [DG::HailoRuntimeAgentImpl::Forward]
When running model 'arcface_mobilefacenet--112x112_quant_hailort_hailo8_1'
[HailoRT] [critical] Executing pipeline terminate failed with status HAILO_RPC_FAILED(77)
[HailoRT] [critical] Executing pipeline terminate failed with status HAILO_RPC_FAILED(77)
[2025-05-22 09:07:29.058] [2053] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-22 09:07:29.059] [2053] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:29.061] [2053] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:54.721] [2075] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-22 09:07:54.723] [2075] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:54.724] [2075] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:54.927] [2075] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-22 09:07:54.928] [2075] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:54.929] [2075] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:54.985] [2075] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-22 09:07:54.986] [2075] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:54.987] [2075] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:55.504] [2091] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-22 09:07:55.505] [2091] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:55.506] [2091] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:55.545] [2091] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: true
[2025-05-22 09:07:56.610] [2091] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-22 09:07:56.611] [2091] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:07:56.662] [2091] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-22 09:07:56.662] [2091] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-22 09:07:56.698] [2091] [HailoRT] [info] [infer_model.cpp:436] [configure] Configuring network group 'scrfd_2_5g' with params: batch size: 0, power mode: ULTRA_PERFORMANCE, latency: NONE
[2025-05-22 09:07:56.701] [2091] [HailoRT] [info] [multi_io_elements.cpp:756] [create] Created (AsyncHwEl)
[2025-05-22 09:07:56.701] [2091] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0scrfd_2_5g/input_layer1 | timeout: 10s)
[2025-05-22 09:07:56.702] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl2AsyncHwEl)
[2025-05-22 09:07:56.702] [2091] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl3AsyncHwEl | timeout: 10s)
[2025-05-22 09:07:56.702] [2091] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl3AsyncHwEl | Reorder - src_order: NHCW, src_shape: (20, 20, 2), dst_order: NHWC, dst_shape: (20, 20, 2))
[2025-05-22 09:07:56.702] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl3AsyncHwEl)
[2025-05-22 09:07:56.702] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl8AsyncHwEl)
[2025-05-22 09:07:56.703] [2091] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl5AsyncHwEl | timeout: 10s)
[2025-05-22 09:07:56.703] [2091] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl5AsyncHwEl | Reorder - src_order: FCR, src_shape: (80, 80, 24), dst_order: FCR, dst_shape: (80, 80, 20))
[2025-05-22 09:07:56.703] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl5AsyncHwEl)
[2025-05-22 09:07:56.703] [2091] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl6AsyncHwEl | timeout: 10s)
[2025-05-22 09:07:56.703] [2091] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl6AsyncHwEl | Reorder - src_order: FCR, src_shape: (20, 20, 24), dst_order: FCR, dst_shape: (20, 20, 20))
[2025-05-22 09:07:56.703] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl6AsyncHwEl)
[2025-05-22 09:07:56.704] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl7AsyncHwEl)
[2025-05-22 09:07:56.704] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl4AsyncHwEl)
[2025-05-22 09:07:56.704] [2091] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl0AsyncHwEl | timeout: 10s)
[2025-05-22 09:07:56.704] [2091] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl0AsyncHwEl | Reorder - src_order: FCR, src_shape: (40, 40, 24), dst_order: FCR, dst_shape: (40, 40, 20))
[2025-05-22 09:07:56.704] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl0AsyncHwEl)
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl1AsyncHwEl | timeout: 10s)
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl1AsyncHwEl | Reorder - src_order: NHCW, src_shape: (20, 20, 8), dst_order: NHWC, dst_shape: (20, 20, 8))
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl1AsyncHwEl)
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0scrfd_2_5g/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 2145)
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0scrfd_2_5g/input_layer1[0] | outputs: PushQEl0AsyncHwEl PushQEl1AsyncHwEl LastAsyncEl2AsyncHwEl PushQEl3AsyncHwEl LastAsyncEl4AsyncHwEl PushQEl5AsyncHwEl PushQEl6AsyncHwEl LastAsyncEl7AsyncHwEl LastAsyncEl8AsyncHwEl
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl0AsyncHwEl(running in thread_id: 2149)
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl0AsyncHwEl | inputs: PushQEl0AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl0AsyncHwEl
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl0AsyncHwEl | inputs: PostInferEl0AsyncHwEl[0] | outputs: user
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl1AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl1AsyncHwEl(running in thread_id: 2150)
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl1AsyncHwEl | inputs: PushQEl1AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl1AsyncHwEl
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl1AsyncHwEl | inputs: PostInferEl1AsyncHwEl[0] | outputs: user
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl2AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl3AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl3AsyncHwEl(running in thread_id: 2146)
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl3AsyncHwEl | inputs: PushQEl3AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl3AsyncHwEl
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl3AsyncHwEl | inputs: PostInferEl3AsyncHwEl[0] | outputs: user
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl4AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl5AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl5AsyncHwEl(running in thread_id: 2147)
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl5AsyncHwEl | inputs: PushQEl5AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl5AsyncHwEl
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl5AsyncHwEl | inputs: PostInferEl5AsyncHwEl[0] | outputs: user
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl6AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl6AsyncHwEl(running in thread_id: 2148)
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl6AsyncHwEl | inputs: PushQEl6AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl6AsyncHwEl
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl6AsyncHwEl | inputs: PostInferEl6AsyncHwEl[0] | outputs: user
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl7AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl8AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-22 09:07:56.705] [2091] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: scrfd_2_5g
[2025-05-22 09:08:06.776] [2091] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-22 09:08:06.778] [2091] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:08:06.779] [2091] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:08:06.809] [2091] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: true
[2025-05-22 09:08:07.811] [2091] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-22 09:08:07.813] [2091] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-22 09:08:07.872] [2091] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-22 09:08:07.872] [2091] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-22 09:08:07.907] [2091] [HailoRT] [info] [infer_model.cpp:436] [configure] Configuring network group 'arcface_mobilefacenet' with params: batch size: 0, power mode: ULTRA_PERFORMANCE, latency: NONE
[2025-05-22 09:08:07.908] [2091] [HailoRT] [info] [multi_io_elements.cpp:756] [create] Created (AsyncHwEl)
[2025-05-22 09:08:07.908] [2091] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0arcface_mobilefacenet/input_layer1 | timeout: 10s)
[2025-05-22 09:08:07.909] [2091] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0AsyncHwEl)
[2025-05-22 09:08:07.909] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0arcface_mobilefacenet/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 2368)
[2025-05-22 09:08:07.909] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0arcface_mobilefacenet/input_layer1[0] | outputs: LastAsyncEl0AsyncHwEl
[2025-05-22 09:08:07.909] [2091] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-22 09:08:07.909] [2091] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-22 09:08:07.909] [2091] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-22 09:08:13.505] [2090] [HailoRT] [error] [infer_model.cpp:1048] [wait] CHECK failed - Waiting for async job to finish has failed with timeout (5000ms)
[2025-05-22 09:08:13.505] [2090] [HailoRT] [error] [infer_model.cpp:1021] [wait] CHECK_SUCCESS failed with status=HAILO_TIMEOUT(4)
[2025-05-22 09:08:13.509] [2091] [HailoRT] [error] [infer_model.cpp:1048] [wait] CHECK failed - Waiting for async job to finish has failed with timeout (5000ms)
[2025-05-22 09:08:13.509] [2091] [HailoRT] [error] [infer_model.cpp:1021] [wait] CHECK_SUCCESS failed with status=HAILO_TIMEOUT(4)
[2025-05-22 09:08:14.000] [2075] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-05-22 09:08:19.007] [2368] [HailoRT] [error] [hailort_rpc_client.cpp:1552] [ConfiguredNetworkGroup_infer_async] CHECK_GRPC_STATUS failed with error code: 4.
[2025-05-22 09:08:19.007] [2368] [HailoRT] [warning] [hailort_rpc_client.cpp:1552] [ConfiguredNetworkGroup_infer_async] Make sure HailoRT service is enabled and active!
[2025-05-22 09:08:19.007] [2368] [HailoRT] [error] [network_group_client.cpp:697] [infer_async] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-22 09:08:19.007] [2368] [HailoRT] [error] [pipeline_internal.cpp:26] [handle_non_recoverable_async_error] Non-recoverable Async Infer Pipeline error. status error code: HAILO_RPC_FAILED(77)
[2025-05-22 09:08:19.007] [2368] [HailoRT] [error] [async_infer_runner.cpp:88] [shutdown] Shutting down the pipeline with status HAILO_RPC_FAILED(77)
[2025-05-22 09:08:24.500] [2075] [HailoRT] [error] [hailort_rpc_client.cpp:647] [ConfiguredNetworkGroup_shutdown] CHECK_GRPC_STATUS failed with error code: 4.
[2025-05-22 09:08:24.500] [2075] [HailoRT] [warning] [hailort_rpc_client.cpp:647] [ConfiguredNetworkGroup_shutdown] Make sure HailoRT service is enabled and active!
[2025-05-22 09:08:24.500] [2075] [HailoRT] [error] [network_group_client.cpp:258] [shutdown] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77) - Failed to shutdown
[2025-05-22 09:08:24.501] [2075] [HailoRT] [error] [multi_io_elements.cpp:1032] [execute_terminate] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-22 09:08:24.501] [2075] [HailoRT] [error] [pipeline.cpp:1034] [execute] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-22 09:08:24.501] [2075] [HailoRT] [error] [queue_elements.cpp:599] [execute_terminate] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-22 09:08:24.501] [2075] [HailoRT] [critical] [async_infer_runner.cpp:99] [shutdown] Executing pipeline terminate failed with status HAILO_RPC_FAILED(77)
[2025-05-22 09:08:25.117] [2129] [HailoRT] [error] [hailort_rpc_client.cpp:38] [client_keep_alive] CHECK_GRPC_STATUS failed with error code: 4.
[2025-05-22 09:08:25.117] [2129] [HailoRT] [warning] [hailort_rpc_client.cpp:38] [client_keep_alive] Make sure HailoRT service is enabled and active!
[2025-05-22 09:08:25.117] [2129] [HailoRT] [error] [rpc_client_utils.hpp:198] [keep_alive] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-22 09:08:29.508] [2368] [HailoRT] [error] [hailort_rpc_client.cpp:647] [ConfiguredNetworkGroup_shutdown] CHECK_GRPC_STATUS failed with error code: 4.
[2025-05-22 09:08:29.508] [2368] [HailoRT] [warning] [hailort_rpc_client.cpp:647] [ConfiguredNetworkGroup_shutdown] Make sure HailoRT service is enabled and active!
[2025-05-22 09:08:29.509] [2368] [HailoRT] [error] [network_group_client.cpp:258] [shutdown] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77) - Failed to shutdown
[2025-05-22 09:08:29.509] [2368] [HailoRT] [error] [multi_io_elements.cpp:1032] [execute_terminate] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-22 09:08:29.509] [2368] [HailoRT] [error] [pipeline.cpp:1034] [execute] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-22 09:08:29.509] [2368] [HailoRT] [error] [queue_elements.cpp:599] [execute_terminate] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-22 09:08:29.509] [2368] [HailoRT] [critical] [async_infer_runner.cpp:99] [shutdown] Executing pipeline terminate failed with status HAILO_RPC_FAILED(77)

I can run a simple object detection script OK (using scrfd_2.5g--640x640_quant_hailort_hailo8_1) both with and without the hailort service enabled. I tried running the script that includes the face recognition model (arcface_mobilefacenet--112x112_quant_hailort_hailo8_1) with the hailort service disabled, and the script just hangs and doesn't even open the OpenCV window:

(venv_hailo_rpi5_examples) conorroche@picam1:~/hailo-rpi5-examples $ python digirum_rec.py 
[0:29:23.565295098] [5230]  INFO Camera camera_manager.cpp:326 libcamera v0.5.0+59-d83ff0a4
[0:29:23.572329829] [5259]  INFO RPI pisp.cpp:720 libpisp version v1.2.1 981977ff21f3 29-04-2025 (14:13:50)
[0:29:23.581458415] [5259]  INFO RPI pisp.cpp:1179 Registered camera /base/axi/pcie@1000120000/rp1/i2c@88000/ov5647@36 to CFE device /dev/media2 and ISP device /dev/media0 using PiSP variant BCM2712_D0
[0:29:23.585916254] [5230]  INFO Camera camera.cpp:1205 configuring streams: (0) 640x480-RGB888 (1) 640x480-GBRG_PISP_COMP1
[0:29:23.586060106] [5259]  INFO RPI pisp.cpp:1483 Sensor: /base/axi/pcie@1000120000/rp1/i2c@88000/ov5647@36 - Selected sensor format: 640x480-SGBRG10_1X10 - Selected CFE format: 640x480-PC1g

So yes, it seems like it is some issue with the Hailo RPC... I did a fresh reinstall (OS + Hailo etc.) and that didn't solve it.
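Since the symptoms point at the HailoRT multi-process service's RPC channel, it may be worth confirming the service state before each run. A hedged sketch (the unit name hailort.service is an assumption based on the standard Debian packaging; adjust to whatever `systemctl list-units | grep -i hailo` shows):

```shell
# Assumption: the HailoRT multi-process service is installed as hailort.service.
# Check whether it is enabled and running:
sudo systemctl status hailort.service

# If the RPC channel is wedged (HAILO_RPC_FAILED), restarting it sometimes helps:
sudo systemctl restart hailort.service

# Follow its log while reproducing the hang:
journalctl -u hailort.service -f
```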

For reference the script that includes face recognition model is here

conorroche@picam1:~/hailo-rpi5-examples $ cat digirum_rec.py 
import degirum as dg
import numpy as np
import cv2
import time
import logging
from picamera2 import Picamera2

# Define a frame generator: a function that yields frames from the Picamera2
def frame_generator():
    picam2 = Picamera2()

    # Configure the camera (optional: set the resolution or other settings)
    picam2.configure(picam2.create_preview_configuration({'format': 'RGB888'}))

    # Start the camera
    picam2.start()

    try:
        while True:
            # Capture a frame as a numpy array
            frame = picam2.capture_array()


            # Yield the frame
            yield frame
    finally:
        picam2.stop()  # Stop the camera when the generator is closed

def align_and_crop(img, landmarks, image_size=112):
    """
    Align and crop the face from the image based on the given landmarks.

    Args:
        img (np.ndarray): The full image (not the cropped bounding box). This image will be transformed.
        landmarks (List[np.ndarray]): List of 5 keypoints (landmarks) as (x, y) coordinates. These keypoints typically include the eyes, nose, and mouth.
        image_size (int, optional): The size to which the image should be resized. Defaults to 112. It is typically either 112 or 128 for face recognition models.

    Returns:
        Tuple[np.ndarray, np.ndarray]: The aligned face image and the transformation matrix.
    """
    # Define the reference keypoints used in ArcFace model, based on a typical facial landmark set.
    _arcface_ref_kps = np.array(
        [
            [38.2946, 51.6963],  # Left eye
            [73.5318, 51.5014],  # Right eye
            [56.0252, 71.7366],  # Nose
            [41.5493, 92.3655],  # Left mouth corner
            [70.7299, 92.2041],  # Right mouth corner
        ],
        dtype=np.float32,
    )

    # Ensure the input landmarks have exactly 5 points (as expected for face alignment)
    assert len(landmarks) == 5

    # Validate that image_size is divisible by either 112 or 128 (common image sizes for face recognition models)
    assert image_size % 112 == 0 or image_size % 128 == 0

    # Adjust the scaling factor (ratio) based on the desired image size (112 or 128)
    if image_size % 112 == 0:
        ratio = float(image_size) / 112.0
        diff_x = 0  # No horizontal shift for 112 scaling
    else:
        ratio = float(image_size) / 128.0
        diff_x = 8.0 * ratio  # Horizontal shift for 128 scaling

    # Apply the scaling and shifting to the reference keypoints
    dst = _arcface_ref_kps * ratio
    dst[:, 0] += diff_x  # Apply the horizontal shift

    # Estimate the similarity transformation matrix to align the landmarks with the reference keypoints
    M, inliers = cv2.estimateAffinePartial2D(np.array(landmarks), dst, ransacReprojThreshold=1000)
    assert np.all(inliers == True)

    # Apply the affine transformation to the input image to align the face
    aligned_img = cv2.warpAffine(img, M, (image_size, image_size), borderValue=0.0)

    return aligned_img, M


# Specify the model name 
face_det_model_name = "scrfd_2.5g--640x640_quant_hailort_hailo8_1"
face_rec_model_name = "arcface_mobilefacenet--112x112_quant_hailort_hailo8_1"

# Specify the inference host address
#inference_host_address = "@cloud"  # Use "@cloud" for cloud inference
inference_host_address = "@local"  # Use "@local" for local inference

# Specify the zoo_url
#zoo_url = "degirum/models_hailort"
zoo_url = "/home/conorroche/models"  # For local model files

token = ''  # Leave empty for local inference

# Load the face detection model
face_det_model = dg.load_model(
    model_name=face_det_model_name,
    inference_host_address=inference_host_address,
    zoo_url=zoo_url,
    token=token, 
    overlay_color=(0, 255, 0)  # Green color for bounding boxes
)

face_rec_model = dg.load_model(
    model_name=face_rec_model_name,
    inference_host_address=inference_host_address,
    zoo_url=zoo_url,
    token=token,
    overlay_color=(0, 255, 0)  # Green color for bounding boxes
)

for result in face_det_model.predict_batch(frame_generator()):
    aligned_faces = []
    if result.results:
        for face in result.results:
            landmarks = [landmark["landmark"] for landmark in face["landmarks"]]
            aligned_face, _ = align_and_crop(result.image, landmarks)  # Align and crop face
            aligned_faces.append(aligned_face)
        for face, face_embedding in zip(result.results, face_rec_model.predict_batch(aligned_faces)):
            embedding = face_embedding.results[0]["data"][0]  # Extract embedding
    
    cv2.imshow("AI Inference", result.image_overlay)

    # Process GUI events and break the loop if 'q' key was pressed
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

# Destroy any remaining OpenCV windows after the loop finishes
cv2.destroyAllWindows()
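The script above stops at extracting the embedding; to actually recognize a face you would compare each embedding against a gallery of enrolled embeddings, typically with cosine similarity. A minimal sketch of that step (the gallery vectors and threshold below are illustrative toy values, not real ArcFace embeddings, which are 512-D):

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine of the angle between two embedding vectors
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Illustrative gallery of enrolled embeddings (toy 3-D vectors for the sketch)
gallery = {
    "alice": np.array([1.0, 0.0, 0.0]),
    "bob": np.array([0.0, 1.0, 0.0]),
}

def identify(embedding, gallery, threshold=0.5):
    """Return the best-matching name, or 'unknown' if nothing beats threshold."""
    best_name, best_score = "unknown", threshold
    for name, ref in gallery.items():
        score = cosine_similarity(embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score
```

In the loop above, each embedding pulled from face_embedding.results[0]["data"][0] would be passed to identify(); the threshold is model- and dataset-dependent and needs tuning on your own enrollment data.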

Hi @Vlad_Klimov

FYI, I can run that model via the CLI using the Hailo multi-process service OK:

conorroche@picam1:~/models $ hailortcli run arcface_mobilefacenet--112x112_quant_hailort_hailo8_1/arcface_mobilefacenet--112x112_quant_hailort_hailo8_1.hef -t30 --multi-process-service
Running streaming inference (arcface_mobilefacenet--112x112_quant_hailort_hailo8_1/arcface_mobilefacenet--112x112_quant_hailort_hailo8_1.hef):
  Transform data: true
    Type:      auto
    Quantized: true
Network arcface_mobilefacenet/arcface_mobilefacenet: 100% | 52210 | FPS: 1740.28 | ETA: 00:00:00
> Inference result:
 Network group: arcface_mobilefacenet
    Frames count: 52210
    FPS: 1740.28

Hi @Vlad_Klimov

One last thing: I have just tried using the models' predict function instead of predict_batch, and that appears not to need the multi-process service. I can get inference results from the face recognition model OK; e.g. the below works with the service stopped.

for frame in frame_generator():
    result = face_det_model.predict(frame)

    if result.results:
        for face in result.results:
            landmarks = [landmark["landmark"] for landmark in face["landmarks"]]
            aligned_face, _ = align_and_crop(result.image, landmarks)  # Align and crop face       
            rec_result = face_rec_model.predict(aligned_face) 
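An aside on the align_and_crop helper used above: the similarity transform that cv2.estimateAffinePartial2D recovers can also be computed in closed form with Umeyama's method. A numpy-only sketch, for intuition only (not part of the original script):

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares similarity transform (scale * rotation + translation)
    mapping src points onto dst points, via Umeyama's method."""
    src = np.asarray(src, dtype=np.float64)
    dst = np.asarray(dst, dtype=np.float64)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)          # cross-covariance of the point sets
    U, S, Vt = np.linalg.svd(cov)
    d = np.sign(np.linalg.det(U) * np.linalg.det(Vt))
    if d == 0:
        d = 1.0
    D = np.diag([1.0, d])                     # reflection guard: keep det(R) = +1
    R = U @ D @ Vt                            # best-fit rotation
    scale = np.trace(np.diag(S) @ D) / src_c.var(axis=0).sum()
    t = mu_d - scale * R @ mu_s
    return np.hstack([scale * R, t[:, None]])  # 2x3 affine, like cv2's output
```

The 2x3 matrix it returns has the same shape as the M that cv2.warpAffine expects, so it can serve as a drop-in mental model for what the OpenCV call is estimating from the 5 landmarks.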

@Conor_Roche,

Thank you for your information - it is very helpful. Our team will try to reproduce your problem and will suggest a solution. Will keep you posted.

Hi,
I am looking to develop this face recognition system using the Raspberry Pi + Hailo8 setup, and I am new to this.
Is there any user guide to install DeGirum on my setup? I want to use the local inference_host_address. My ultimate goal is to run face recognition on my setup.
Is there a detailed user manual covering everything from setup to the end?

Hi @Dendukuri_Narendra_v
The guide here is pretty comprehensive and covers all steps. Maybe you can provide a little more info on where you are getting stuck. If you are looking for steps to install degirum and basic examples you can look at: DeGirum/hailo_examples: DeGirum PySDK with Hailo AI Accelerators

Thank you for the link
I got stuck at the very beginning. I have the setup (Raspberry Pi 5 + Hailo8).
Now I need to run the face recognition locally (I don't want to use the cloud).
Can you provide the steps I need to follow?
For my setup I don't want to install the whole GitHub repo.
Please help me out by summarizing the detailed steps I need to follow to achieve my goal with my setup.

Hi Vlad,
It gets past import picamera2 after rebuilding degirum_env at the system level. However, I have the following error when I try to run inference.

TypeError                                 Traceback (most recent call last)
Cell In[1], line 33
     30 picam2.stop()
     32 # Run inference and display
---> 33 for result in model.predict_batch(picamera2_frame_generator(rotate=True)):
     34     cv2.imshow("AI Inference PiCamera2", result.image_overlay)
     35     if cv2.waitKey(1) & 0xFF == ord("q"):

TypeError: picamera2_frame_generator() got an unexpected keyword argument 'rotate'

I want to use the .hef files that are in my local folder with this code. Should I mention the path of that .hef file here?

@Simon_Ho
Can you share your picamera2_frame_generator function? Looks like it is not defined to accept rotate as an argument.
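If a rotate option is wanted, the generator itself has to accept it. A minimal sketch of the idea (frames below stands in for the Picamera2 capture loop; this wrapper is illustrative, not PySDK API, and a 180° rotation is just flipping both axes, equivalent to cv2.rotate with ROTATE_180):

```python
import numpy as np

def rotating_frame_generator(frames, rotate=False):
    """Yield frames, optionally rotated 180 degrees.

    `frames` is any iterable of numpy arrays; in the real script it would be
    the Picamera2 capture loop (picam2.capture_array() in a while-True).
    """
    for frame in frames:
        if rotate:
            frame = frame[::-1, ::-1]  # 180° rotation: reverse rows and columns
        yield frame
```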

@Dendukuri_Narendra_v
A model in PySDK is made of multiple files: the .hef file, the model JSON, and an optional labels file. You can download the model assets from our AI Hub.
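For a local zoo, the folder pointed to by zoo_url typically contains one subdirectory per model holding those files together. A hypothetical layout (all names illustrative):

```shell
# Hypothetical local model zoo layout (names are examples, not real paths):
# /home/<user>/models/
# └── scrfd_10g--640x640_quant_hailort_hailo8_1/
#     ├── scrfd_10g--640x640_quant_hailort_hailo8_1.hef    # compiled network
#     ├── scrfd_10g--640x640_quant_hailort_hailo8_1.json   # model parameters for PySDK
#     └── labels_scrfd.json                                 # optional labels file
# zoo_url then points at /home/<user>/models, and model_name selects the subdirectory.
```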

Hi Shashi,
I just tried to run the code in your guideline.

import cv2
import degirum as dg
import numpy as np
from picamera2 import Picamera2

your_model_name = "scrfd_10g--640x640_quant_hailort_hailo8l_1"
your_host_address = "@cloud" # Can be dg.CLOUD, host:port, or dg.LOCAL
your_model_zoo = "degirum/hailo"
your_token = "<token>"

# Load the model
model = dg.load_model(
    model_name = your_model_name, 
    inference_host_address = your_host_address, 
    zoo_url = your_model_zoo, 
    token = your_token 
    # optional parameters, such as overlay_show_probabilities = True
)

# Define frame generator using Picamera2
def picamera2_frame_generator():
    picam2 = Picamera2()
    picam2.configure(picam2.preview_configuration(main={"format": 'BGR888'}))
    picam2.start()
    try:
        while True:
            frame = picam2.capture_array()
            yield frame
    finally:
        picam2.stop()

# Run inference and display
for result in model.predict_batch(picamera2_frame_generator(rotate=True)):
    cv2.imshow("AI Inference PiCamera2", result.image_overlay)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cv2.destroyAllWindows()

Thank you for all the code and the support that you provided
Let me wrap up my thoughts. For now I am running only the scrfd model.
1. I have the Raspberry Pi + Hailo8 chip and installed the Bookworm OS on it.
2. I installed Hailo on this setup:

sudo apt update
sudo apt install hailo-all
sudo reboot

3. I installed DeGirum:
pip3 install degirum-pysdk degirum-tools
4. Now, to run the scrfd_10g model, I downloaded the .hef file for the Hailo8 and used it in this code:
import degirum as dg

# Path to your local .hef file
hef_path = "/path/to/your/model.hef"

# Load the model from local .hef file for local inference
model = dg.load_model(
    hef=hef_path,
    inference_host_address='@local'
)
# Prepare input image or array as per model input requirements (e.g. numpy array)
# For example, if your model expects a (1, 3, H, W) tensor or image array:
# image_array = your_preprocessing_function("path_to_input_image.jpg")



# Run inference
results = model(image_array)

print(results)

Now, did I miss anything here?
It would be a great help to know whether the steps I am following are correct.

Thank you so much
So what I am thinking to do is: I will download the scrfd model from the AI Hub, which includes
1. the .hef file
2. labels.json
3. dequantize.py

Then I will use the path of the folder containing all these files:

# zoo_url = "<path to local folder>"

I hope this is the correct way to use inference_host_address="@local", right?
Thank you

Hi @Simon_Ho
We will fix the guide. There is an extra argument rotate=True that is not supported. Please modify as below:

# Run inference and display
for result in model.predict_batch(picamera2_frame_generator()):
    cv2.imshow("AI Inference PiCamera2", result.image_overlay)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

I'm facing the same issue, any updates? @shashi @Vlad_Klimov I'd appreciate your help.

@Vlad_Klimov

These are the last few minutes of my hailort.log before the freeze:

[2025-05-27 17:27:32.327] [3607] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: false
[2025-05-27 17:27:32.439] [3607] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-27 17:27:32.439] [3607] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-27 17:27:32.443] [3607] [HailoRT] [info] [internal_buffer_manager.cpp:204] [print_execution_results] Planned internal buffer memory: CMA memory 0, user memory 50688. memory to edge layer usage factor is 1
[2025-05-27 17:27:32.443] [3607] [HailoRT] [info] [internal_buffer_manager.cpp:212] [print_execution_results] Default Internal buffer planner executed successfully
[2025-05-27 17:27:32.455] [3607] [HailoRT] [info] [device_internal.cpp:57] [configure] Configuring HEF took 12.843519 milliseconds
[2025-05-27 17:27:32.455] [3607] [HailoRT] [info] [vdevice.cpp:749] [configure] Configuring HEF on VDevice took 16.657237 milliseconds
[2025-05-27 17:27:32.455] [3607] [HailoRT] [info] [infer_model.cpp:436] [configure] Configuring network group 'arcface_mobilefacenet' with params: batch size: 0, power mode: ULTRA_PERFORMANCE, latency: NONE
[2025-05-27 17:27:32.455] [3607] [HailoRT] [info] [multi_io_elements.cpp:756] [create] Created (AsyncHwEl)
[2025-05-27 17:27:32.457] [3607] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0arcface_mobilefacenet/input_layer1 | timeout: 10s)
[2025-05-27 17:27:32.457] [3607] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0AsyncHwEl)
[2025-05-27 17:27:32.457] [3607] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0arcface_mobilefacenet/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 3618)
[2025-05-27 17:27:32.457] [3607] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0arcface_mobilefacenet/input_layer1[0] | outputs: LastAsyncEl0AsyncHwEl
[2025-05-27 17:27:32.457] [3607] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-05-27 17:27:32.457] [3607] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-27 17:27:32.457] [3607] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: arcface_mobilefacenet
[2025-05-27 17:27:32.479] [3621] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-27 17:27:32.487] [3621] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-27 17:27:32.489] [3621] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-27 17:27:32.522] [3621] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: true
[2025-05-27 17:27:32.523] [3621] [HailoRT] [error] [hailort_rpc_client.cpp:48] [get_service_version] CHECK_GRPC_STATUS failed with error code: 14.
[2025-05-27 17:27:32.523] [3621] [HailoRT] [warning] [hailort_rpc_client.cpp:48] [get_service_version] Make sure HailoRT service is enabled and active!
[2025-05-27 17:27:32.523] [3621] [HailoRT] [error] [rpc_client_utils.hpp:138] [init_client_service_communication_impl] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-27 17:27:32.523] [3621] [HailoRT] [error] [rpc_client_utils.hpp:81] [init_client_service_communication] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-27 17:27:32.523] [3621] [HailoRT] [error] [vdevice.cpp:340] [create] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-27 17:27:32.523] [3621] [HailoRT] [error] [vdevice.cpp:535] [create] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-27 17:27:32.523] [3621] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: false
[2025-05-27 17:27:32.585] [3621] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: yolov8n_relu6_age
[2025-05-27 17:27:32.585] [3621] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: yolov8n_relu6_age
[2025-05-27 17:27:32.586] [3621] [HailoRT] [info] [buffer_requirements.cpp:195] [find_initial_desc_page_size] Using non-default initial_desc_page_size of 64, due to a small transfer size (8)
[2025-05-27 17:27:32.590] [3621] [HailoRT] [info] [internal_buffer_manager.cpp:202] [print_execution_results] Default Internal buffer planner failed to meet requirements
[2025-05-27 17:27:32.590] [3621] [HailoRT] [info] [internal_buffer_manager.cpp:212] [print_execution_results] Default Internal buffer planner executed successfully
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [device_internal.cpp:57] [configure] Configuring HEF took 8.200997 milliseconds
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [vdevice.cpp:749] [configure] Configuring HEF on VDevice took 8.435 milliseconds
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [infer_model.cpp:436] [configure] Configuring network group 'yolov8n_relu6_age' with params: batch size: 0, power mode: ULTRA_PERFORMANCE, latency: NONE
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [multi_io_elements.cpp:756] [create] Created (AsyncHwEl)
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0yolov8n_relu6_age/input_layer1 | timeout: 10s)
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl0AsyncHwEl | timeout: 10s)
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl0AsyncHwEl | Padding Periph shape - src_shape: (1, 1, 1), dst_shape: (1, 1, 1))
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl0AsyncHwEl)
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0yolov8n_relu6_age/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 3629)
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0yolov8n_relu6_age/input_layer1[0] | outputs: PushQEl0AsyncHwEl
[2025-05-27 17:27:32.593] [3621] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl0AsyncHwEl(running in thread_id: 3630)
[2025-05-27 17:27:32.594] [3621] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl0AsyncHwEl | inputs: PushQEl0AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl0AsyncHwEl
[2025-05-27 17:27:32.594] [3621] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl0AsyncHwEl | inputs: PostInferEl0AsyncHwEl[0] | outputs: user
[2025-05-27 17:27:32.594] [3621] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: yolov8n_relu6_age
[2025-05-27 17:27:32.594] [3621] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: yolov8n_relu6_age
[2025-05-27 17:27:32.614] [3634] [HailoRT] [info] [device.cpp:49] [Device] OS Version: Linux 6.12.25+rpt-rpi-2712 #1 SMP PREEMPT Debian 1:6.12.25-1+rpt1 (2025-04-30) aarch64
[2025-05-27 17:27:32.623] [3634] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-27 17:27:32.631] [3634] [HailoRT] [info] [control.cpp:108] [control__parse_identify_results] firmware_version is: 4.20.0
[2025-05-27 17:27:32.666] [3634] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: true
[2025-05-27 17:27:32.667] [3634] [HailoRT] [error] [hailort_rpc_client.cpp:48] [get_service_version] CHECK_GRPC_STATUS failed with error code: 14.
[2025-05-27 17:27:32.667] [3634] [HailoRT] [warning] [hailort_rpc_client.cpp:48] [get_service_version] Make sure HailoRT service is enabled and active!
[2025-05-27 17:27:32.667] [3634] [HailoRT] [error] [rpc_client_utils.hpp:138] [init_client_service_communication_impl] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-27 17:27:32.667] [3634] [HailoRT] [error] [rpc_client_utils.hpp:81] [init_client_service_communication] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-27 17:27:32.667] [3634] [HailoRT] [error] [vdevice.cpp:340] [create] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-27 17:27:32.667] [3634] [HailoRT] [error] [vdevice.cpp:535] [create] CHECK_SUCCESS failed with status=HAILO_RPC_FAILED(77)
[2025-05-27 17:27:32.667] [3634] [HailoRT] [info] [vdevice.cpp:523] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: false
[2025-05-27 17:27:32.734] [3634] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: yolov8n_relu6_fairface_gender
[2025-05-27 17:27:32.734] [3634] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: yolov8n_relu6_fairface_gender
[2025-05-27 17:27:32.736] [3634] [HailoRT] [info] [buffer_requirements.cpp:195] [find_initial_desc_page_size] Using non-default initial_desc_page_size of 64, due to a small transfer size (8)
[2025-05-27 17:27:32.736] [3634] [HailoRT] [info] [internal_buffer_manager.cpp:202] [print_execution_results] Default Internal buffer planner failed to meet requirements
[2025-05-27 17:27:32.736] [3634] [HailoRT] [info] [internal_buffer_manager.cpp:212] [print_execution_results] Default Internal buffer planner executed successfully
[2025-05-27 17:27:32.745] [3634] [HailoRT] [info] [device_internal.cpp:57] [configure] Configuring HEF took 11.425044 milliseconds
[2025-05-27 17:27:32.745] [3634] [HailoRT] [info] [vdevice.cpp:749] [configure] Configuring HEF on VDevice took 11.659768 milliseconds
[2025-05-27 17:27:32.745] [3634] [HailoRT] [info] [infer_model.cpp:436] [configure] Configuring network group 'yolov8n_relu6_fairface_gender' with params: batch size: 0, power mode: ULTRA_PERFORMANCE, latency: NONE
[2025-05-27 17:27:32.745] [3634] [HailoRT] [info] [multi_io_elements.cpp:756] [create] Created (AsyncHwEl)
[2025-05-27 17:27:32.745] [3634] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0yolov8n_relu6_fairface_gender/input_layer1 | timeout: 10s)
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl0AsyncHwEl | timeout: 10s)
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [filter_elements.cpp:375] [create] Created (PostInferEl0AsyncHwEl | Padding Periph shape - src_shape: (1, 1, 2), dst_shape: (1, 1, 2))
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0PostInferEl0AsyncHwEl)
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0yolov8n_relu6_fairface_gender/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 3641)
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0yolov8n_relu6_fairface_gender/input_layer1[0] | outputs: PushQEl0AsyncHwEl
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: PostInferEl0AsyncHwEl(running in thread_id: 3642)
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PostInferEl0AsyncHwEl | inputs: PushQEl0AsyncHwEl[0] | outputs: LastAsyncEl0PostInferEl0AsyncHwEl
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0PostInferEl0AsyncHwEl | inputs: PostInferEl0AsyncHwEl[0] | outputs: user
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: yolov8n_relu6_fairface_gender
[2025-05-27 17:27:32.746] [3634] [HailoRT] [info] [hef.cpp:1929] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: yolov8n_relu6_fairface_gender
[2025-05-27 17:33:02.016] [3551] [HailoRT] [error] [infer_model.cpp:1048] [wait] CHECK failed - Waiting for async job to finish has failed with timeout (5000ms)
[2025-05-27 17:33:02.016] [3551] [HailoRT] [error] [infer_model.cpp:1021] [wait] CHECK_SUCCESS failed with status=HAILO_TIMEOUT(4)

Could it be caused by an incompatibility with the Raspberry Pi OS kernel version? I have one device running fine with kernel 6.12.20 (I believe, need to confirm), and the other one, on 6.12.25, is the one that is failing. I'll confirm in a couple of hours.
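To compare the two devices and rule out the service-related errors visible in the log (the gRPC error code 14 and "Make sure HailoRT service is enabled and active!" lines, after which HailoRT falls back to `multi_process_service: false`), a quick diagnostic sketch like the one below may help. The service name `hailort.service` assumes a standard HailoRT install; the `|| true` guards just keep the script going on machines where a tool is missing.

```shell
#!/bin/sh
# Kernel version: compare the working device (6.12.20?) with the failing one (6.12.25)
uname -r

# Check whether the HailoRT multi-process service is running; the log shows
# vdevice creation first failing with HAILO_RPC_FAILED(77) and then retrying
# without the service, which matches a stopped/absent service.
systemctl is-active hailort.service || true

# Confirm the device firmware matches the 4.20.0 runtime reported in the log
hailortcli fw-control identify || true
```

If the kernel version turns out to be the only difference between the two Pis, pinning both to the same kernel (or updating the hailo PCIe driver to match) would be the natural next test.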