HEF format is not compatible with device. Device arch: HAILO8L, HEF arch: HAILO8

I am getting the following error when I run inference with a Hailo-8L model downloaded from the following GitHub repo.

Hailo Model Zoo

[HailoRT] [error] HEF format is not compatible with device. Device arch: HAILO8L, HEF arch: HAILO8
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INVALID_HEF(26)
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/hailo_platform/pyhailort/pyhailort.py", line 2847, in configure
    configured_infer_model_cpp_obj = self._infer_model.configure()
                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
hailo_platform.pyhailort._pyhailort.HailoRTStatusException: 26

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/suraas/Desktop/face_detection/arcface.py", line 16, in <module>
    with infer_model.configure() as configured_infer_model:
         ^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3/dist-packages/hailo_platform/pyhailort/pyhailort.py", line 2843, in configure
    with ExceptionWrapper():
  File "/usr/lib/python3/dist-packages/hailo_platform/pyhailort/pyhailort.py", line 111, in __exit__
    self._raise_indicative_status_exception(value)
  File "/usr/lib/python3/dist-packages/hailo_platform/pyhailort/pyhailort.py", line 156, in _raise_indicative_status_exception
    raise self.create_exception_from_status(error_code) from libhailort_exception
hailo_platform.pyhailort.pyhailort.HailoRTInvalidHEFException: Invalid HEF. See hailort.log for more information

The model is clearly listed under the Hailo-8L folder, but when I download it and run inference it gives me the above error. Here is the code I used to test the inference.

import numpy as np
from hailo_platform import VDevice, HailoSchedulingAlgorithm, HEF
import cv2

timeout_ms = 1000

params = VDevice.create_params()
params.scheduling_algorithm = HailoSchedulingAlgorithm.ROUND_ROBIN

with VDevice(params) as vdevice:

    # Create an infer model from an HEF:
    infer_model = vdevice.create_infer_model('/home/suraas/Desktop/face_detection/arcface_r50.hef')

    # Configure the infer model and create bindings for it
    with infer_model.configure() as configured_infer_model:
        bindings = configured_infer_model.create_bindings()

        # Load and preprocess the input image
        image = cv2.imread('/home/suraas/Desktop/face_detection/images/sample2.png')
        # Resize image to match model's expected input shape
        resized_image = cv2.resize(image, (112, 112))  # Note: cv2.resize takes (width, height)
        # Convert to the dtype the model expects (UINT8 for this model)
        input_tensor = resized_image.astype(np.uint8)
        # Set the preprocessed image as input
        bindings.input().set_buffer(input_tensor)

        # Get all output tensors
        output_tensors = infer_model.outputs

        # Print output shapes for debugging
        for output in output_tensors:
            print(f"Output '{output.name}' shape: {output.shape}")

        # Create buffers for each output
        output_buffers = {}
        for output in output_tensors:
            # Create buffer with exact shape and use uint8 dtype
            buffer = np.zeros(output.shape, dtype=np.uint8)
            bindings.output(output.name).set_buffer(buffer)
            output_buffers[output.name] = buffer

        # Run synchronous inference
        configured_infer_model.run([bindings], timeout_ms)

        # Get results from each output buffer
        results = {}
        for output_name, buffer in output_buffers.items():
            results[output_name] = bindings.output(output_name).get_buffer()
        
        # Alternatively, the same bindings can be run asynchronously
        job = configured_infer_model.run_async([bindings])
        job.wait(timeout_ms)
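As an aside, the arcface_r50 model produces a 512-element face embedding, which is typically compared with cosine similarity. A minimal sketch of that comparison (the embeddings here are random stand-ins for two dequantized fc1 outputs; dequantization of the raw UINT8 buffer is left out):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    a = a.astype(np.float32).ravel()
    b = b.astype(np.float32).ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Hypothetical embeddings standing in for two 512-element fc1 outputs
rng = np.random.default_rng(0)
emb1 = rng.random(512)
emb2 = emb1 + 0.01 * rng.random(512)  # a slightly perturbed copy

print(cosine_similarity(emb1, emb2))  # close to 1.0 for near-identical embeddings
```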

Hi @hyperwolf,

The Hailo Model Zoo master branch has been updated to version 2.14.0 due to the upcoming Software Suite release (2025-01). Therefore, some models are not available for Hailo-8L yet. Apologies for the inconvenience.
Please get the model from the Model Zoo version 2.13.0, which is compatible with HailoRT 4.19.0.
You can use the hailortcli parse-hef command to verify which architecture the HEF
was compiled for. For example, using the v2.13 arcface_r50 model:

hailortcli parse-hef arcface_r50.hef 
Architecture HEF was compiled for: HAILO8L
Network group name: arcface_r50, Multi Context - Number of contexts: 6
    Network name: arcface_r50/arcface_r50
        VStream infos:
            Input  arcface_r50/input_layer1 UINT8, NHWC(112x112x3)
            Output arcface_r50/fc1 UINT8, NC(512)
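That architecture line can also be checked from a script before attempting to configure the model. A small sketch (the helper names are my own, and hef_arch assumes hailortcli is on PATH; only the text parsing is shown end to end):

```python
import re
import subprocess

def hef_arch_from_parse_output(text: str) -> str:
    """Extract the target architecture from `hailortcli parse-hef` output."""
    m = re.search(r"Architecture HEF was compiled for:\s*(\S+)", text)
    if not m:
        raise ValueError("architecture line not found in parse-hef output")
    return m.group(1)

def hef_arch(hef_path: str) -> str:
    """Run hailortcli parse-hef and return the architecture string."""
    out = subprocess.run(
        ["hailortcli", "parse-hef", hef_path],
        capture_output=True, text=True, check=True,
    ).stdout
    return hef_arch_from_parse_output(out)

# Parsing the sample output shown above
sample = "Architecture HEF was compiled for: HAILO8L\nNetwork group name: arcface_r50"
print(hef_arch_from_parse_output(sample))  # HAILO8L
```

Comparing the returned string against the device architecture (HAILO8L here) before calling infer_model.configure() gives a clearer failure message than the HAILO_INVALID_HEF status.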