Streaming inference mode in Python throws error

I have a problem with the Streaming inference mode in Python.

I followed the HailoRT User Guide for version 4.20; pages 64 to 66 show examples of how to do inference in Python. The InferPipeline API example on page 64 works flawlessly for me and produces the right results, but I cannot get the streaming inference example on page 65 to work. It always crashes after sending fewer than one hundred example images, with the following error message:

        raise self.create_exception_from_status(error_code) from libhailort_exception
    hailo_platform.pyhailort.pyhailort.HailoRTTimeout: Received a timeout - hailort has failed because a timeout had occurred
    [HailoRT] [error] CHECK failed - Timeout waiting on cond variable
    [HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_TIMEOUT(4)
    [HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_TIMEOUT(4) - HwWriteEl14model/input_layer1 (H2D) failed with status=HAILO_TIMEOUT(4)
    Process Process-1:

I just copied and pasted the code from the example. What could I possibly be doing wrong here?

Hey @Fabian_Stern,

Welcome to the Hailo Community!

The errors you're seeing - HwWriteEl ... (H2D) failed with status=HAILO_TIMEOUT(4) - mean that a write from your host to the device got stuck and timed out.

When HailoRT logs Timeout waiting on cond variable, it's usually because the input DMA couldn't get the resources it needed. This typically happens when the device isn't consuming data - meaning the network isn't actually running or making progress, so the host-side write buffers fill up until the write times out.
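To build intuition for why a stalled device side produces an H2D write timeout on the host, here is a small hardware-free sketch using a bounded `queue.Queue` as a hypothetical stand-in for the input stream's buffer (the names `stalled_consumer` and `buf` are mine, not HailoRT API):

```python
import queue
import threading

# Stand-in for the input stream's finite buffer between host and device.
buf = queue.Queue(maxsize=4)

def stalled_consumer():
    # Drain a couple of frames, then stop - simulating a device that
    # stops making progress (e.g. the network never actually runs).
    for _ in range(2):
        buf.get()
    # ...never calls get() again

threading.Thread(target=stalled_consumer, daemon=True).start()

sent = 0
try:
    for frame in range(100):
        # Like a host-to-device write: blocks when the buffer is full,
        # and raises once the timeout expires.
        buf.put(frame, timeout=1.0)
        sent += 1
except queue.Full:
    print(f"timed out after sending {sent} frames")
```

The producer sends fine at first, then blocks once the buffer fills, and finally times out - the same pattern as the example crashing only after some number of images rather than immediately.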

Could you share what inference setup you’re using?

Also, it might help to check out how we set up Python inference in our examples here: