Inference Fails with Network Group Not Activated (Hailo-8L)

Hi Hailo team,

I’ve been trying to run inference on a Raspberry Pi 5 with the Hailo-8L using both my own .hef and the officially provided cas_vit_s.hef, but I consistently receive the following error:

[HailoRT] [error] Trying to write to vstream before its network group is activated
HailoRTNetworkGroupNotActivatedException

Here’s what I’ve confirmed:
✅ My .hef is compiled for HAILO8L, verified via hailortcli parse-hef
✅ I'm also testing with a known-good .hef (cas_vit_s.hef) from the Model Zoo
✅ I call network_group.activate() before creating InferVStreams
✅ I pass a uint8, C-contiguous NumPy tensor of the expected shape (e.g. (384, 384, 384, 3), a batch of 384 frames, for cas_vit_s)
✅ I've tested with a minimal script (see below) and rebooted
✅ Inference still fails with the same activation error
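For what it's worth, the dtype/contiguity item above can be double-checked independently of the Hailo stack with plain NumPy. This is just a sanity-check sketch (a smaller batch of 4 frames is used here purely to keep it light; the shape and dtype assumptions mirror what cas_vit_s expects):

```python
import numpy as np

# Build a dummy NHWC batch like the one in the script below
# (4 frames here instead of 384, just for a quick check).
batch = np.random.randint(0, 255, size=(4, 384, 384, 3), dtype=np.uint8)
batch = np.ascontiguousarray(batch)

# Sanity checks before handing the buffer to a vstream.
assert batch.dtype == np.uint8, "vstream input should be uint8"
assert batch.flags["C_CONTIGUOUS"], "buffer should be C-contiguous"
assert batch.shape[1:] == (384, 384, 3), "per-frame shape should match the HEF"
print("input buffer looks OK")
```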

from pathlib import Path
import numpy as np
from hailo_platform import (
    HEF, VDevice, ConfigureParams, HailoStreamInterface,
    InputVStreamParams, OutputVStreamParams, InferVStreams
)

hef = HEF("cas_vit_s.hef")
vdev = VDevice()
config = ConfigureParams.create_from_hef(hef, interface=HailoStreamInterface.PCIe)
network_group = vdev.configure(hef, config)[0]

network_group.activate()

input_params = InputVStreamParams.make(network_group)
output_params = OutputVStreamParams.make(network_group)

input_name = list(input_params.keys())[0]
output_name = list(output_params.keys())[0]

dummy_input = np.random.randint(0, 255, size=(384, 384, 384, 3), dtype=np.uint8)
dummy_input = np.ascontiguousarray(dummy_input)

with InferVStreams(network_group, input_params, output_params) as pipeline:
    output = pipeline.infer({input_name: dummy_input})

Any idea what might be causing this consistent activation error, even in a minimal test?

Thanks so much for your help — and for this great platform.

I think I found the issue: activate() returns a context manager, and the network group is only held active inside its with block. I was calling it bare and discarding the result, so nothing was active by the time InferVStreams ran. With with network_group.activate(): in place, this script now runs inference:

from pathlib import Path

import numpy as np  # type: ignore
from hailo_platform import (  # type: ignore
    HEF,
    ConfigureParams,
    HailoStreamInterface,
    InferVStreams,
    InputVStreamParams,
    OutputVStreamParams,
    VDevice,
)

# Load known-good Hailo-8L model
hef_path = Path("cas_vit_s.hef")
if not hef_path.exists():
    raise FileNotFoundError(hef_path)

hef = HEF(str(hef_path))
vdev = VDevice()
config = ConfigureParams.create_from_hef(hef, interface=HailoStreamInterface.PCIe)
network_group = vdev.configure(hef, config)[0]

input_params = InputVStreamParams.make(network_group)
output_params = OutputVStreamParams.make(network_group)

input_name = list(input_params.keys())[0]
output_name = list(output_params.keys())[0]

# Create dummy input: batch of 384 NHWC images (384x384x3)
dummy_input = np.random.randint(0, 255, size=(384, 384, 384, 3), dtype=np.uint8)
dummy_input = np.ascontiguousarray(dummy_input)

with network_group.activate():
    with InferVStreams(network_group, input_params, output_params) as pipeline:
        output = pipeline.infer({input_name: dummy_input})
        result = output[output_name]

print("✅ Inference succeeded")
print("Result shape:", result.shape)
print("Sample output:", result[0][:10])