Equalization for Non-First Order Homogeneous Activation Functions

Hi,

According to the documentation of the Dataflow Compiler, the equalization algorithm is used during quantization at every optimization level. However, for equalization to balance the model weights without altering the inference results, the activation function A must be first-order homogeneous, meaning that for any positive scalar c it must satisfy A(cx) = cA(x).
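
For illustration, here is a minimal NumPy sketch (not Hailo code) that checks this condition numerically: ReLU satisfies the identity for a positive scale factor, while GELU does not. The tanh approximation of GELU is used here just for a self-contained example.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def gelu(x):
    # tanh approximation of GELU
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

x = np.linspace(-3.0, 3.0, 7)
c = 2.5  # a positive scale factor

print(np.allclose(relu(c * x), c * relu(x)))  # True  -> ReLU is first-order homogeneous
print(np.allclose(gelu(c * x), c * gelu(x)))  # False -> GELU is not
```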

So, if the activation function does not satisfy the first-order homogeneity condition, as is the case for GELU, is there any process in place to ensure that equalization still works correctly?

Hi @koki.igari,
This is a good question. Unlike ReLU, which satisfies the first-order homogeneity condition you mentioned, activations such as GELU do not, so applying equalization across them would change the inference results. The toolchain therefore automatically disables equalization for layers with such activations, and you do not need to handle these layers manually when running the Hailo optimization.
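
To make the reasoning concrete, here is a small sketch of why cross-layer equalization relies on homogeneity. It is illustrative only and not the toolchain's implementation: one layer's output channels are scaled down and the next layer's corresponding inputs are scaled up, which leaves the network output unchanged with ReLU in between but not with GELU.

```python
import numpy as np

rng = np.random.default_rng(0)
relu = lambda x: np.maximum(x, 0.0)
gelu = lambda x: 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

W1 = rng.normal(size=(4, 8))          # layer 1: 8 inputs -> 4 channels
W2 = rng.normal(size=(3, 4))          # layer 2: 4 channels -> 3 outputs
x = rng.normal(size=8)
s = np.array([0.5, 2.0, 4.0, 0.25])   # positive per-channel equalization scales

# Equalized weights: scale layer 1 channels down, layer 2 inputs up.
W1_eq = W1 / s[:, None]
W2_eq = W2 * s[None, :]

for name, act in [("ReLU", relu), ("GELU", gelu)]:
    original  = W2 @ act(W1 @ x)
    equalized = W2_eq @ act(W1_eq @ x)
    print(name, np.allclose(original, equalized))
# ReLU: True  -> equalization preserves the output
# GELU: False -> the output changes, so equalization must be skipped
```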

Regards,


Thank you for your response. I checked the profiler and could indeed confirm that equalization is automatically disabled for those layers.