However, if I do this, I get:
[HailoRT] [error] CHECK failed - write size 602112 must be 150528
Similarly, I get the same issue for output.read:
[HailoRT] [error] CHECK failed - Buffer size is not the same as expected for pool! (16000 != 4000)
Do I also have to configure the InputVStream and OutputVStream somehow in order to pass a full batch?
Or is the batch size handled completely inside the Hailo runtime, so that regardless of the configured batch size, the data has to be passed image by image?
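The numbers line up with my batch size: 602112 = 4 × 150528 (one 224 × 224 × 3 frame) and 16000 = 4 × 4000. In other words, I am pushing a full batch of 4 in a single call, roughly like this (a simplified sketch using the HailoRT Python API; the HEF path and variable names are placeholders, not my actual code):

```python
import numpy as np
from hailo_platform import (HEF, VDevice, ConfigureParams, HailoStreamInterface,
                            InputVStreams, InputVStreamParams)

hef = HEF('model.hef')  # placeholder path
with VDevice() as target:
    configure_params = ConfigureParams.create_from_hef(hef, interface=HailoStreamInterface.PCIe)
    network_group = target.configure(hef, configure_params)[0]
    input_vstreams_params = InputVStreamParams.make(network_group)

    # A full batch of 4 frames: 4 x (224 x 224 x 3) = 602112 bytes.
    batch = np.zeros((4, 224, 224, 3), dtype=np.uint8)

    with InputVStreams(network_group, input_vstreams_params) as input_vstreams:
        with network_group.activate(network_group.create_params()):
            for input_vstream in input_vstreams:
                # This is where I hit the error: the vstream expects a single
                # 150528-byte frame, not the 602112-byte batch buffer.
                input_vstream.send(batch)
```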
Hi @Wolfram_Strothmann
The batch size is handled inside HailoRT. You should keep pushing frames one at a time; we will manage the batching. Likewise on the output, you will get your inference results back one frame at a time. This way you can use the same code without worrying about the batch size. In addition, we can automatically fall back to a lower batch size if not enough frames have been pushed after some timeout (configurable).
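For illustration, here is a minimal per-frame sketch using the HailoRT Python API (hailo_platform); the HEF path, input shape, and frame source are placeholder assumptions, not taken from the post above:

```python
import numpy as np
from hailo_platform import (HEF, VDevice, ConfigureParams, HailoStreamInterface,
                            InferVStreams, InputVStreamParams, OutputVStreamParams)

hef = HEF('model.hef')  # placeholder path
with VDevice() as target:
    # Configure the network group; any batch size set here stays internal to HailoRT.
    configure_params = ConfigureParams.create_from_hef(hef, interface=HailoStreamInterface.PCIe)
    network_group = target.configure(hef, configure_params)[0]

    input_vstreams_params = InputVStreamParams.make(network_group)
    output_vstreams_params = OutputVStreamParams.make(network_group)
    input_name = hef.get_input_vstream_infos()[0].name
    output_name = hef.get_output_vstream_infos()[0].name

    with InferVStreams(network_group, input_vstreams_params, output_vstreams_params) as pipeline:
        with network_group.activate(network_group.create_params()):
            for _ in range(8):  # placeholder frame source
                # One frame per call: a single 224x224x3 uint8 frame is 150528 bytes,
                # matching the expected single-frame size regardless of batch size.
                frame = np.zeros((1, 224, 224, 3), dtype=np.uint8)
                results = pipeline.infer({input_name: frame})
                output = results[output_name]  # one result back per pushed frame
```

HailoRT accumulates the individual frames into hardware batches internally, so this loop stays the same no matter which batch size the network group was configured with.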