run_async call blocking

I am running code very similar to the following example on our camera backend.

from threading import Thread
import time
import numpy as np
from functools import partial
from hailo_platform import VDevice, HailoSchedulingAlgorithm, FormatType, pyhailort
import os
import multiprocessing

number_of_frames = 1000000
timeout_ms = 10000
os.environ['HAILO_MONITOR'] = '1'

def example_callback(completion_info, bindings=None, start_time=None):
    end_time = time.monotonic()

def infer(should_use_multi_process_service=False, model_path=None):
    # Create a VDevice
    params = VDevice.create_params()
    params.scheduling_algorithm = HailoSchedulingAlgorithm.ROUND_ROBIN
    params.group_id = "SHARED"
    if should_use_multi_process_service:
        params.multi_process_service = should_use_multi_process_service

    with VDevice(params) as vdevice:
        hef = pyhailort.pyhailort.HEF(model_path)
        network_group_names = hef.get_network_group_names()

        # Create an infer model from an HEF:
        infer_model = vdevice.create_infer_model(model_path, name=network_group_names[0])

        # Set optional infer model parameters
        infer_model.set_batch_size(1)

        # For a single input / output model, the input / output object
        # can be accessed with a name parameter ...
        if "model.hef" in model_path:
            infer_model.input("model/input_layer1").set_format_type(FormatType.UINT8)
            infer_model.output().set_format_type(FormatType.FLOAT32)
        else:
            infer_model.input("vits_indoor_224_224/input_layer1").set_format_type(FormatType.UINT8)
            infer_model.output().set_format_type(FormatType.UINT8)

        # Once the infer model is set, configure the infer model
        with infer_model.configure() as configured_infer_model:
            for _ in range(number_of_frames):
                # Create bindings for it and set buffers
                bindings = configured_infer_model.create_bindings()
                bindings.input().set_buffer(np.empty(infer_model.input().shape).astype(np.uint8))
                bindings.output().set_buffer(np.empty(infer_model.output().shape).astype(np.float32 if "model.hef" in model_path else np.uint8))

                # Wait for the async pipeline to be ready, and start an async inference job
                configured_infer_model.wait_for_async_ready(timeout_ms=timeout_ms)

                # Any callable can be passed as callback (lambda, function, functools.partial), as long
                # as it has a keyword argument "completion_info"
                job = configured_infer_model.run_async([bindings], partial(example_callback, bindings=bindings))

            # Wait for the last job
            job.wait(timeout_ms)

def multiprocess_target(should_use_multi_process_service, model_path):
    # Note: the arguments are unused in this repro; each thread hardcodes its own model path
    pool = [
        Thread(target=infer, args=(False, "model.hef")),
        Thread(target=infer, args=(False, "vits_indoor_224_224.hef")),
    ]

    print('Starting async inference on multiple models using threads')

    for job in pool:
        job.start()
    time.sleep(1)
    for job in pool:
        job.join()

my_process = multiprocessing.Process(target=multiprocess_target, args=(False, None))
my_process.start()
my_process.join()
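For anyone hitting a similar hang: the loop above only waits on the last job, and nothing bounds how many in-flight run_async calls can pile up. One generic way to keep a submit loop from saturating internal queues is to cap in-flight jobs with a semaphore released from the completion callback. Below is a minimal, framework-agnostic sketch of that pattern — `fake_run_async`, `MAX_IN_FLIGHT`, and `on_complete` are hypothetical names, and `fake_run_async` merely stands in for `configured_infer_model.run_async`; it is not a HailoRT API:

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

MAX_IN_FLIGHT = 4  # hypothetical bound; tune to the pipeline's queue depth

executor = ThreadPoolExecutor(max_workers=2)
slots = threading.Semaphore(MAX_IN_FLIGHT)
done = []

def fake_run_async(bindings, callback):
    # Stand-in for configured_infer_model.run_async(); just simulates latency.
    def work():
        time.sleep(0.001)
        callback(completion_info=None)
    return executor.submit(work)

def on_complete(completion_info=None):
    done.append(completion_info)
    slots.release()  # free a slot only once the result is actually back

jobs = []
for _ in range(20):
    # Blocks here, at a visible and bounded point, instead of deep inside run_async
    slots.acquire()
    jobs.append(fake_run_async(None, on_complete))

# Wait for every job, not just the last one
for job in jobs:
    job.result(timeout=5)
executor.shutdown()
```

With this shape, a stall shows up at the `slots.acquire()` call in your own code rather than somewhere inside the runtime, which makes it much easier to attribute.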

This seems to work when I run the script as-is. But sometimes, when running this code in a different shell, or when running a similar example in our camera backend (code not shared here), I get no Python exceptions; the run_async calls for both models simply block. Our backend code operates much like the example above.

Within my hailort.log, I am getting the following:

[2025-08-14 16:41:14.193] [2979] [HailoRT] [info] [hef.cpp:1994] [get_network
[2025-08-14 16:41:44.605] [3242] [HailoRT] [info] [vdevice.cpp:535] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: false
[2025-08-14 16:41:44.618] [3242] [HailoRT] [info] [device.cpp:51] [Device] OS Version: Linux 6.6.31+rpt-rpi-v8 #1 SMP PREEMPT Debian 1:6.6.31-1+rpt1 (2024-05-29) aarch64
[2025-08-14 16:41:44.619] [3242] [HailoRT] [info] [control.cpp:113] [control__parse_identify_results] firmware_version is: 4.21.0
[2025-08-14 16:41:44.619] [3242] [HailoRT] [info] [vdevice.cpp:681] [create] VDevice Infos: 0000:01:00.0
[2025-08-14 16:41:44.734] [3242] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: model
[2025-08-14 16:41:44.734] [3242] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: model
[2025-08-14 16:41:44.735] [3243] [HailoRT] [info] [vdevice.cpp:535] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: false     
[2025-08-14 16:41:44.976] [3243] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: vits_indoor_224_224  
[2025-08-14 16:41:44.976] [3243] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: vits_indoor_224_224  
[2025-08-14 16:41:44.991] [3242] [HailoRT] [info] [internal_buffer_manager.cpp:77] [print_execution_results] Planned internal buffer memory: CMA=0 CMA-Desc=65536 Pinned=1048576. memory to edge layer usage factor is 1
[2025-08-14 16:41:44.991] [3242] [HailoRT] [info] [internal_buffer_manager.cpp:86] [print_execution_results] Default Internal buffer planner executed successfully
[2025-08-14 16:41:45.024] [3242] [HailoRT] [info] [device_internal.cpp:57] [configure] Configuring HEF took 46.866246 milliseconds
[2025-08-14 16:41:45.025] [3242] [HailoRT] [info] [vdevice.cpp:779] [configure] Configuring HEF on VDevice took 47.559338 milliseconds
[2025-08-14 16:41:45.025] [3242] [HailoRT] [info] [infer_model.cpp:417] [configure] Configuring network group 'model' with params: batch size: 1, power mode: PERFORMANCE, latency: NONE        
[2025-08-14 16:41:45.025] [3242] [HailoRT] [info] [multi_io_elements.cpp:754] [create] Created (AsyncHwEl)
[2025-08-14 16:41:45.025] [3242] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0model/input_layer1 | timeout: 10s)
[2025-08-14 16:41:45.025] [3242] [HailoRT] [info] [filter_elements.cpp:101] [create] Created (PreInferEl3model/input_layer1 | Reorder - src_order: NHWC, src_shape: (640, 640, 3), dst_order: NHCW, dst_shape: (640, 640, 3))
[2025-08-14 16:41:45.026] [3242] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl3model/input_layer1 | timeout: 10s)
[2025-08-14 16:41:45.026] [3242] [HailoRT] [info] [multi_io_elements.cpp:135] [create] Created (NmsPPMuxEl0YOLOv5-Post-Process | Op YOLOV5, Name: YOLOv5-Post-Process, Score threshold: 0.200, IoU threshold: 0.60, Classes: 15, Max bboxes per class: 80, Image height: 640, Image width: 640)
[2025-08-14 16:41:45.027] [3242] [HailoRT] [info] [queue_elements.cpp:942] [create] Created (MultiPushQEl0YOLOv5-Post-Process | timeout: 10s)
[2025-08-14 16:41:45.027] [3242] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0NmsPPMuxEl0YOLOv5-Post-Process)
[2025-08-14 16:41:45.027] [3242] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0model/input_layer1 | inputs: user | outputs: PreInferEl3model/input_layer1(running in thread_id: 3264)
[2025-08-14 16:41:45.027] [3242] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PreInferEl3model/input_layer1 | inputs: EntryPushQEl0model/input_layer1[0] | outputs: PushQEl3model/input_layer1
[2025-08-14 16:41:45.028] [3242] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl3model/input_layer1 | inputs: PreInferEl3model/input_layer1[0] | outputs: AsyncHwEl(running in thread_id: 3265)
[2025-08-14 16:41:45.028] [3242] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: PushQEl3model/input_layer1[0] | outputs: MultiPushQEl0YOLOv5-Post-Process MultiPushQEl0YOLOv5-Post-Process MultiPushQEl0YOLOv5-Post-Process
[2025-08-14 16:41:45.028] [3242] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] MultiPushQEl0YOLOv5-Post-Process | inputs: AsyncHwEl[0] AsyncHwEl[1] AsyncHwEl[2] | outputs: NmsPPMuxEl0YOLOv5-Post-Process(running in thread_id: 3266)
[2025-08-14 16:41:45.028] [3242] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] NmsPPMuxEl0YOLOv5-Post-Process | inputs: MultiPushQEl0YOLOv5-Post-Process[0] | outputs: LastAsyncEl0NmsPPMuxEl0YOLOv5-Post-Process
[2025-08-14 16:41:45.028] [3242] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0NmsPPMuxEl0YOLOv5-Post-Process | inputs: NmsPPMuxEl0YOLOv5-Post-Process[0] | outputs: user
[2025-08-14 16:41:45.049] [3243] [HailoRT] [info] [internal_buffer_manager.cpp:77] [print_execution_results] Planned internal buffer memory: CMA=0 CMA-Desc=2424832 Pinned=1502720. memory to edge layer usage factor is 0.29542023
[2025-08-14 16:41:45.049] [3243] [HailoRT] [info] [internal_buffer_manager.cpp:86] [print_execution_results] Default Internal buffer planner executed successfully
[2025-08-14 16:41:45.189] [3243] [HailoRT] [info] [device_internal.cpp:57] [configure] Configuring HEF took 142.628277 milliseconds
[2025-08-14 16:41:45.189] [3243] [HailoRT] [info] [vdevice.cpp:779] [configure] Configuring HEF on VDevice took 143.091759 milliseconds
[2025-08-14 16:41:45.189] [3243] [HailoRT] [info] [infer_model.cpp:417] [configure] Configuring network group 'vits_indoor_224_224' with params: batch size: 1, power mode: PERFORMANCE, latency: NONE
[2025-08-14 16:41:45.189] [3243] [HailoRT] [info] [multi_io_elements.cpp:754] [create] Created (AsyncHwEl)
[2025-08-14 16:41:45.190] [3243] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0vits_indoor_224_224/input_layer1 | timeout: 10s)
[2025-08-14 16:41:45.190] [3243] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0AsyncHwEl)
[2025-08-14 16:41:45.190] [3243] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0vits_indoor_224_224/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 3267)
[2025-08-14 16:41:45.190] [3243] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0vits_indoor_224_224/input_layer1[0] | outputs: LastAsyncEl0AsyncHwEl
[2025-08-14 16:41:45.190] [3243] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-08-14 16:41:45.244] [3242] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: model
[2025-08-14 16:41:45.314] [3268] [HailoRT] [info] [vdevice.cpp:535] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: false     
[2025-08-14 16:41:45.366] [3268] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: model
[2025-08-14 16:41:45.366] [3268] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: model
[2025-08-14 16:41:45.429] [3269] [HailoRT] [info] [vdevice.cpp:535] [create] Creating vdevice with params: device_count: 1, scheduling_algorithm: ROUND_ROBIN, multi_process_service: false     
[2025-08-14 16:41:45.656] [3269] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: vits_indoor_224_224  
[2025-08-14 16:41:45.656] [3269] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: vits_indoor_224_224  
[2025-08-14 16:41:45.658] [3269] [HailoRT] [info] [vdevice.cpp:779] [configure] Configuring HEF on VDevice took 0.467111 milliseconds
[2025-08-14 16:41:45.658] [3269] [HailoRT] [info] [infer_model.cpp:417] [configure] Configuring network group 'vits_indoor_224_224' with params: batch size: 1, power mode: PERFORMANCE, latency: NONE
[2025-08-14 16:41:45.658] [3269] [HailoRT] [info] [multi_io_elements.cpp:754] [create] Created (AsyncHwEl)
[2025-08-14 16:41:45.684] [3269] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0vits_indoor_224_224/input_layer1 | timeout: 10s)
[2025-08-14 16:41:45.684] [3269] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0AsyncHwEl)
[2025-08-14 16:41:45.684] [3269] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0vits_indoor_224_224/input_layer1 | inputs: user | outputs: AsyncHwEl(running in thread_id: 3270)
[2025-08-14 16:41:45.684] [3269] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: EntryPushQEl0vits_indoor_224_224/input_layer1[0] | outputs: LastAsyncEl0AsyncHwEl
[2025-08-14 16:41:45.684] [3269] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0AsyncHwEl | inputs: AsyncHwEl[0] | outputs: user
[2025-08-14 16:41:45.686] [3268] [HailoRT] [info] [vdevice.cpp:779] [configure] Configuring HEF on VDevice took 0.444649 milliseconds
[2025-08-14 16:41:45.686] [3268] [HailoRT] [info] [infer_model.cpp:417] [configure] Configuring network group 'model' with params: batch size: 1, power mode: PERFORMANCE, latency: NONE        
[2025-08-14 16:41:45.686] [3268] [HailoRT] [info] [multi_io_elements.cpp:754] [create] Created (AsyncHwEl)
[2025-08-14 16:41:45.686] [3268] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (EntryPushQEl0model/input_layer1 | timeout: 10s)
[2025-08-14 16:41:45.686] [3268] [HailoRT] [info] [filter_elements.cpp:101] [create] Created (PreInferEl3model/input_layer1 | Reorder - src_order: NHWC, src_shape: (640, 640, 3), dst_order: NHCW, dst_shape: (640, 640, 3))
[2025-08-14 16:41:45.687] [3268] [HailoRT] [info] [queue_elements.cpp:450] [create] Created (PushQEl3model/input_layer1 | timeout: 10s)
[2025-08-14 16:41:45.687] [3268] [HailoRT] [info] [multi_io_elements.cpp:135] [create] Created (NmsPPMuxEl0YOLOv5-Post-Process | Op YOLOV5, Name: YOLOv5-Post-Process, Score threshold: 0.200, IoU threshold: 0.60, Classes: 15, Max bboxes per class: 80, Image height: 640, Image width: 640)
[2025-08-14 16:41:45.688] [3268] [HailoRT] [info] [queue_elements.cpp:942] [create] Created (MultiPushQEl0YOLOv5-Post-Process | timeout: 10s)
[2025-08-14 16:41:45.688] [3268] [HailoRT] [info] [edge_elements.cpp:187] [create] Created (LastAsyncEl0NmsPPMuxEl0YOLOv5-Post-Process)
[2025-08-14 16:41:45.688] [3268] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] EntryPushQEl0model/input_layer1 | inputs: user | outputs: PreInferEl3model/input_layer1(running in thread_id: 3272)
[2025-08-14 16:41:45.688] [3268] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PreInferEl3model/input_layer1 | inputs: EntryPushQEl0model/input_layer1[0] | outputs: PushQEl3model/input_layer1
[2025-08-14 16:41:45.688] [3268] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] PushQEl3model/input_layer1 | inputs: PreInferEl3model/input_layer1[0] | outputs: AsyncHwEl(running in thread_id: 3273)
[2025-08-14 16:41:45.688] [3268] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] AsyncHwEl | inputs: PushQEl3model/input_layer1[0] | outputs: MultiPushQEl0YOLOv5-Post-Process MultiPushQEl0YOLOv5-Post-Process MultiPushQEl0YOLOv5-Post-Process
[2025-08-14 16:41:45.688] [3268] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] MultiPushQEl0YOLOv5-Post-Process | inputs: AsyncHwEl[0] AsyncHwEl[1] AsyncHwEl[2] | outputs: NmsPPMuxEl0YOLOv5-Post-Process(running in thread_id: 3274)
[2025-08-14 16:41:45.688] [3268] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] NmsPPMuxEl0YOLOv5-Post-Process | inputs: MultiPushQEl0YOLOv5-Post-Process[0] | outputs: LastAsyncEl0NmsPPMuxEl0YOLOv5-Post-Process
[2025-08-14 16:41:45.688] [3268] [HailoRT] [info] [pipeline.cpp:891] [print_deep_description] LastAsyncEl0NmsPPMuxEl0YOLOv5-Post-Process | inputs: NmsPPMuxEl0YOLOv5-Post-Process[0] | outputs: user
[2025-08-14 16:41:45.757] [3268] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: model
[2025-08-14 16:41:45.837] [3243] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-08-14 16:41:45.838] [3243] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-08-14 16:41:45.838] [3243] [HailoRT] [info] [queue_elements.cpp:570] [execute_deactivate] enqueue() in element EntryPushQEl0vits_indoor_224_224/input_layer1 was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-08-14 16:41:45.838] [3243] [HailoRT] [info] [queue_elements.cpp:46] [~BaseQueueElement] Queue element EntryPushQEl0vits_indoor_224_224/input_layer1 has 0 frames in his Queue on destruction
[2025-08-14 16:41:45.910] [3269] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-08-14 16:41:45.912] [3269] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-08-14 16:41:45.912] [3269] [HailoRT] [info] [queue_elements.cpp:570] [execute_deactivate] enqueue() in element EntryPushQEl0vits_indoor_224_224/input_layer1 was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-08-14 16:41:45.912] [3269] [HailoRT] [info] [queue_elements.cpp:46] [~BaseQueueElement] Queue element EntryPushQEl0vits_indoor_224_224/input_layer1 has 0 frames in his Queue on destruction
[2025-08-14 16:41:45.936] [3242] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-08-14 16:41:45.937] [3242] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-08-14 16:41:45.937] [3242] [HailoRT] [info] [queue_elements.cpp:1131] [execute_deactivate] enqueue() in element MultiPushQEl0YOLOv5-Post-Process was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-08-14 16:41:45.937] [3242] [HailoRT] [info] [queue_elements.cpp:1131] [execute_deactivate] enqueue() in element MultiPushQEl0YOLOv5-Post-Process was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-08-14 16:41:45.937] [3242] [HailoRT] [info] [queue_elements.cpp:1131] [execute_deactivate] enqueue() in element MultiPushQEl0YOLOv5-Post-Process was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-08-14 16:41:45.937] [3242] [HailoRT] [info] [queue_elements.cpp:570] [execute_deactivate] enqueue() in element PushQEl3model/input_layer1 was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-08-14 16:41:45.937] [3242] [HailoRT] [info] [queue_elements.cpp:570] [execute_deactivate] enqueue() in element EntryPushQEl0model/input_layer1 was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-08-14 16:41:45.938] [3242] [HailoRT] [info] [queue_elements.cpp:46] [~BaseQueueElement] Queue element EntryPushQEl0model/input_layer1 has 0 frames in his Queue on destruction
[2025-08-14 16:41:45.938] [3242] [HailoRT] [info] [queue_elements.cpp:46] [~BaseQueueElement] Queue element PushQEl3model/input_layer1 has 0 frames in his Queue on destruction
[2025-08-14 16:41:45.944] [3268] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-08-14 16:41:45.945] [3268] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down
[2025-08-14 16:41:45.945] [3268] [HailoRT] [info] [queue_elements.cpp:1131] [execute_deactivate] enqueue() in element MultiPushQEl0YOLOv5-Post-Process was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-08-14 16:41:45.945] [3268] [HailoRT] [info] [queue_elements.cpp:1131] [execute_deactivate] enqueue() in element MultiPushQEl0YOLOv5-Post-Process was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)
[2025-08-14 16:41:45.945] [3268] [HailoRT] [info] [queue_elemen

Any idea why this would work in some situations but not others? I don’t see anything that clearly indicates an error; the only info messages that give me pause are the following:

[2025-08-14 16:41:45.757] [3268] [HailoRT] [info] [hef.cpp:1994] [get_network_group_and_network_name] No name was given. Addressing all networks of default network_group: model

[2025-08-14 16:41:45.837] [3243] [HailoRT] [info] [async_infer_runner.cpp:86] [shutdown] Pipeline was aborted. Shutting it down

[2025-08-14 16:41:45.838] [3243] [HailoRT] [info] [queue_elements.cpp:570] [execute_deactivate] enqueue() in element EntryPushQEl0vits_indoor_224_224/input_layer1 was aborted, got status = HAILO_SHUTDOWN_EVENT_SIGNALED(57)

We are running on Raspbian Bookworm with kernel version 6.6.31+rpt-rpi-v8.

Solved: the error was on our end. Not sure how to delete this post…