Issue with running a multi-output model

Has anyone tried to run inference with a multi-output model on the Hailo-8?

Note: I have a .hef model obtained after conversion, not directly from training.

Hey @dbiswal

Welcome to the Hailo Community!

Yes, the Hailo-8 does support multi-output models.
Here is a simplified example to illustrate the process (the layer names are placeholders; use the output names from your HEF):

import numpy as np
from hailo_platform import VDevice, HailoSchedulingAlgorithm

# Initialize VDevice parameters
params = VDevice.create_params()
params.scheduling_algorithm = HailoSchedulingAlgorithm.ROUND_ROBIN

# Create and configure the VDevice
with VDevice(params) as vdevice:
    # Load the HEF model
    infer_model = vdevice.create_infer_model('path/to/your_model.hef')

    # Configure the model
    with infer_model.configure() as configured_infer_model:
        # Allocate one input buffer and one buffer per output layer
        # (buffer dtypes must match the configured stream format types)
        input_buffer = np.empty(infer_model.input().shape, dtype=np.uint8)
        output_buffer_1 = np.empty(infer_model.output("output_1").shape, dtype=np.uint8)
        output_buffer_2 = np.empty(infer_model.output("output_2").shape, dtype=np.uint8)

        # Create bindings and attach the buffers to the named layers
        bindings = configured_infer_model.create_bindings()
        bindings.input().set_buffer(input_buffer)
        bindings.output("output_1").set_buffer(output_buffer_1)
        bindings.output("output_2").set_buffer(output_buffer_2)

        # Run synchronous inference (timeout in milliseconds)
        configured_infer_model.run([bindings], 10000)

        # Read back each output buffer
        output_data_1 = bindings.output("output_1").get_buffer()
        output_data_2 = bindings.output("output_2").get_buffer()
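If you don't want to hard-code the output layer names, you can also read them from the loaded model and bind a buffer per output in a loop. A minimal sketch continuing from the example above (it assumes infer_model.output_names lists the HEF's output layers; please verify against your HailoRT version):

output_buffers = {}
for name in infer_model.output_names:
    # One buffer per output layer, sized from the model metadata
    output_buffers[name] = np.empty(infer_model.output(name).shape, dtype=np.uint8)
    bindings.output(name).set_buffer(output_buffers[name])

# Run and collect every output by name
configured_infer_model.run([bindings], 10000)
results = {name: bindings.output(name).get_buffer() for name in output_buffers}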

For more info about the API and how to use it, please check out the HailoRT documentation (pages 278-284):
https://hailo.ai/developer-zone/

@omria, as I am new to the hailo_platform package, could you point me to a more relevant repo for this? I am sure I am doing something wrong, which is why I am getting this error:

File "/usr/lib/python3.10/site-packages/hailo_platform/pyhailort/pyhailort.py", line 3198, in create_bindings
    for input_name, buffer in input_buffers.items():
AttributeError: 'str' object has no attribute 'items'
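
From the traceback it looks like create_bindings expects dictionaries that map layer names to pre-allocated numpy buffers rather than plain strings, so I assume the call should look roughly like this (the input layer name is a placeholder; the output names are the ones from my model):

import numpy as np

# Sketch based on the traceback: create_bindings seems to expect dicts of
# {layer_name: numpy_buffer}, not strings.
input_buffers = {
    "input_layer1": np.empty(infer_model.input("input_layer1").shape, dtype=np.uint8),
}
output_buffers = {
    name: np.empty(infer_model.output(name).shape, dtype=np.uint8)
    for name in ("output_layer1", "output_layer2", "output_layer3", "output_layer4")
}
bindings = configured_infer_model.create_bindings(
    input_buffers=input_buffers, output_buffers=output_buffers
)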

Note: I want something like the object_detection example in hailo-ai/Hailo-Application-Code-Examples (runtime/python/object_detection at commit 29bb76d9564e5759444a149b98fe4b913f34f657 on GitHub), but with my model, which has multiple outputs: output_layer1, output_layer2, output_layer3 and output_layer4.

It seems that a post-processing function is needed to convert the output values from output_layer1 to output_layer4 into the desired information.
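
Something like the rough sketch below is what I expect to end up with (decode_detections is a hypothetical placeholder for the model-specific decoding, not a function from hailo_platform or the examples repo):

# Gather the raw tensors and hand them to a model-specific decoder
raw_outputs = {
    name: bindings.output(name).get_buffer()
    for name in ("output_layer1", "output_layer2", "output_layer3", "output_layer4")
}
detections = decode_detections(raw_outputs)  # placeholder: e.g. box decoding + NMS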

Hi @dbiswal,

To support multiple outputs with the object_detection example, please change line 13 to:

from utils import HailoInference

And line 212 to:

    hailo_inference = HailoInference(args.net)

That is needed because HailoAsyncInference currently supports only one output. The class will shortly be updated to support multiple outputs.
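
In the meantime, collecting several outputs with the async API can look roughly like the sketch below; the call signatures follow the async-inference examples, so please double-check them against the HailoRT version you have installed:

from functools import partial

def callback(completion_info, bindings, output_names):
    # Called by HailoRT when the async job completes
    if completion_info.exception:
        print(f"Inference failed: {completion_info.exception}")
        return
    # Collect every output tensor by layer name and post-process as needed
    results = {name: bindings.output(name).get_buffer() for name in output_names}

output_names = list(infer_model.output_names)
configured_infer_model.wait_for_async_ready(timeout_ms=10000)
job = configured_infer_model.run_async(
    [bindings], partial(callback, bindings=bindings, output_names=output_names)
)
job.wait(10000)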

However, please note that the recommended way is to always use the hailort-postprocess with architectures that support it (YOLO, SSD, CenterNet). When the hailort-postprocess is used, there is only one output.