Cannot compile any darknet YOLO models - even Hailo's own license plate detector

Hi @i_j
We will take a look and see if we can help. Can you share any details on the model architecture or a source repo?

Just to clarify: we are not from Hailo but from DeGirum, a software company that builds PySDK, which helps in developing applications on Hailo devices.


Much appreciated, thanks @shashi! Here is the source repo with the model architecture:

@i_j

I recommend setting the end node name to /model/model.7/Conv and performing the postprocessing off chip.

Something like this:

import numpy as np

# Helper functions
def _softmax(x, axis=0):
    e_x = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e_x / e_x.sum(axis=axis, keepdims=True)

def _sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Constants (single-anchor model; values are in normalized [0, 1] image coordinates)
anchor_w = 0.04250100255012512
anchor_h = 0.05551774054765701

# Output grid dimensions (Sy rows, Sx columns)
Sy = 129
Sx = 97

# Top-left offset of each grid cell, broadcast over the (Sy, Sx) map
_Cxs = np.tile(np.linspace(0, 1 - 1 / Sx, Sx), (Sy, 1))
_Cys = np.tile(np.linspace(0, 1 - 1 / Sy, Sy).reshape(-1, 1), (1, Sx))

# Postprocessing
x = YOUR_HAILO_RESULT_FROM_INFERENCE  # shape: (batch, 5 + num_classes, Sy, Sx)

classification = _softmax(x[:, 5:, :, :], axis=1)  # class probabilities over the channel axis
clamped_whs = np.clip(x[:, 2:4, :, :], None, 80)   # clamp w/h logits so exp() cannot overflow

# Assemble (cx, cy, w, h, objectness, per-class probs) along the channel axis
result_tensor = np.concatenate(
    (
        # Box center: sigmoid offset within the cell plus the cell's grid offset
        ((1 / Sx) * _sigmoid(x[:, 0, :, :]) + _Cxs)[:, None, :, :],
        ((1 / Sy) * _sigmoid(x[:, 1, :, :]) + _Cys)[:, None, :, :],
        # Box size: anchor scaled by exp of the (clamped) w/h logits
        anchor_w * np.exp(clamped_whs[:, 0:1, :, :]),
        anchor_h * np.exp(clamped_whs[:, 1:2, :, :]),
        # Objectness score
        _sigmoid(x[:, 4, :, :])[:, None, :, :],
        # One channel per class probability
        *np.split(classification, classification.shape[1], axis=1),
    ),
    axis=1,
)
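To actually get detections out of the assembled tensor, you still need confidence filtering (and typically NMS, which is omitted here). A minimal sketch of the thresholding step in plain NumPy, assuming the `(batch, 5 + num_classes, Sy, Sx)` layout above; the function name and threshold value are illustrative, not part of any Hailo or DeGirum API:

```python
import numpy as np

def extract_detections(result_tensor, conf_threshold=0.5):
    """Filter a (batch, 5 + num_classes, Sy, Sx) tensor into a per-image list
    of (cx, cy, w, h, score, class_id) tuples.

    Coordinates stay in normalized [0, 1] image space; NMS is not applied.
    """
    detections = []
    for b in range(result_tensor.shape[0]):
        obj = result_tensor[b, 4]             # objectness map, shape (Sy, Sx)
        cls_probs = result_tensor[b, 5:]      # shape (num_classes, Sy, Sx)
        scores = obj * cls_probs.max(axis=0)  # combined confidence per cell
        class_ids = cls_probs.argmax(axis=0)
        ys, xs = np.where(scores >= conf_threshold)
        dets = [
            (
                float(result_tensor[b, 0, y, x]),  # cx
                float(result_tensor[b, 1, y, x]),  # cy
                float(result_tensor[b, 2, y, x]),  # w
                float(result_tensor[b, 3, y, x]),  # h
                float(scores[y, x]),
                int(class_ids[y, x]),
            )
            for y, x in zip(ys, xs)
        ]
        detections.append(dets)
    return detections
```

You would call this on `result_tensor` from the snippet above, then run your preferred NMS on each image's list.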

Hi @i_j
Just checking in: have you had a chance to try the suggestion from @lawrence?

Hi @shashi and @lawrence - thank you both for your help! Stripping the model of its last layers as recommended allows it to compile to .har.

I have not yet had a chance to verify that the trimmed model plus off-chip post-processing matches the accuracy of the original full model. I will post results here once I have them. A big thanks to you both again!