C/C++ example of batch size > 1 please

Hi,
I can't get inference running with a batch size != 1.
Judging by the name of the function, infer_model->set_batch_size(batchSize) would seem to be the obvious way to do it, but I get a size mismatch on the output buffer when I try it.
Here is my code, in a single file:

In my example code batchSize = 1, and that works. But raising it causes a HailoRT failure when I try to start the inference job.

Is there a sample I can learn from somewhere? I’m not finding anything.

Hey @rogojin,

The API to use for setting the batch size is as follows:

Inference Model Initialization and Configuration
The example creates an inference model based on the given HEF file using the function hailort::VDevice::create_infer_model(). It then sets the output format type to HAILO_FORMAT_TYPE_FLOAT32 using hailort::InferModel::InferStream::set_format_type(), sets the batch size to BATCH_SIZE (defined at the top of the example) using hailort::InferModel::set_batch_size(), and finally configures the inference model using hailort::InferModel::configure().
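
For reference, here is a rough, untested sketch of that sequence, loosely following the async inference flow described above. The HEF path, the BATCH_SIZE value, and the single-input/single-output assumption are placeholders, error handling is abbreviated, and the buffers are sized per frame from get_frame_size(), so please double-check against the user guide how buffers should be sized once the batch size is greater than 1:

```cpp
// Rough sketch (not tested): set format type and batch size before configure(),
// then bind buffers and run one async inference job.
#include "hailo/hailort.hpp"

#include <chrono>
#include <cstdint>
#include <iostream>
#include <vector>

constexpr uint16_t BATCH_SIZE = 4; // placeholder value

int main()
{
    auto vdevice = hailort::VDevice::create();
    if (!vdevice) {
        std::cerr << "Failed to create vdevice, status = " << vdevice.status() << std::endl;
        return vdevice.status();
    }

    // Create the inference model from the HEF file ("model.hef" is a placeholder).
    auto infer_model_exp = vdevice.value()->create_infer_model("model.hef");
    if (!infer_model_exp) {
        std::cerr << "Failed to create infer model, status = " << infer_model_exp.status() << std::endl;
        return infer_model_exp.status();
    }
    auto infer_model = infer_model_exp.release();

    // Both calls must happen before configure().
    infer_model->output()->set_format_type(HAILO_FORMAT_TYPE_FLOAT32);
    infer_model->set_batch_size(BATCH_SIZE);

    auto configured_infer_model = infer_model->configure();
    if (!configured_infer_model) {
        std::cerr << "Failed to configure infer model, status = " << configured_infer_model.status() << std::endl;
        return configured_infer_model.status();
    }

    // Bind one input and one output buffer (assumes a single-input, single-output model).
    auto bindings = configured_infer_model->create_bindings();
    if (!bindings) {
        std::cerr << "Failed to create bindings, status = " << bindings.status() << std::endl;
        return bindings.status();
    }
    std::vector<uint8_t> input_data(infer_model->input()->get_frame_size());
    std::vector<uint8_t> output_data(infer_model->output()->get_frame_size());
    bindings->input()->set_buffer(hailort::MemoryView(input_data.data(), input_data.size()));
    bindings->output()->set_buffer(hailort::MemoryView(output_data.data(), output_data.size()));

    // Start one async inference job and wait for it to complete.
    auto job = configured_infer_model->run_async(bindings.value(),
        [](const hailort::AsyncInferCompletionInfo &) { /* output_data is valid once this fires */ });
    if (!job) {
        std::cerr << "Failed to start async infer job, status = " << job.status() << std::endl;
        return job.status();
    }
    return job->wait(std::chrono::milliseconds(1000));
}
```

It should build against the HailoRT headers and link with libhailort (e.g. -lhailort).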

For more information, please check out the HailoRT user guide; this example is described on page 60 of the documentation.

The documentation can be found here: https://hailo.ai/developer-zone/documentation/

Best regards,

Thanks! I should have expected a comprehensive user doc like this, but I just assumed it didn't exist.
