Batch size controls how many inputs are sent to the inference engine together, which can increase throughput. Batching itself is handled by HailoRT at runtime. For an explanation by @KlausK, see the "Custom batch_size for models" thread in the General category of the Hailo Community forum.