Hey @whrltjr93 ,
Welcome to the Hailo Community!
Thank you for your question! Below is the relevant information on the operators and frameworks supported by Hailo devices, specifically the Hailo-8L. It should answer whether a model using STFT or GeLU activations is compatible with the Hailo-8L NPU, and suggest workarounds where an operation is not natively supported.
1. Supported Operators on Hailo NPUs
Hailo devices, including the Hailo-8L, support a broad range of operations. As per the documentation:
- **Supported Activations:**
- Mish, Hard-Swish, SiLU
- GeLU (preview)
- PReLU, Tanh, Exp, and Sqrt
- **Supported Operators:**
- Convolutions, Depthwise Convolutions
- Element-wise operations: Add, Subtract, Multiply, Divide
- Pooling: Average Pooling (extended support)
- Other operators: Expand, ReduceL2, Square, InstanceNormalization
Unfortunately, STFT (Short-Time Fourier Transform) is not listed among the natively supported operations. However, the Hailo toolchain is flexible enough that a model can usually be restructured so that unsupported operations are either moved off-chip (as a pre- or post-processing step) or replaced with compatible ones before compilation.
2. Framework Compatibility
Hailo devices are compatible with multiple frameworks, including:
- TensorFlow / TensorFlow Lite
- PyTorch
- Keras and ONNX models
The framework you use (PyTorch, TensorFlow, or TFLite) won’t significantly affect compatibility, as long as the operations used are available after the model is converted and compiled using the Hailo Dataflow Compiler.
3. Workaround for Unsupported Operators
If certain operators like STFT are not natively supported, consider these options:
- **Preprocess STFT off-chip:** Perform the STFT on the CPU or a separate processing unit, and feed the transformed data into the Hailo NPU for further inference.
- **Approximate Operators Using Supported Layers:** GeLU, for example, can be approximated using other supported non-linear activations such as SiLU or Hard-Swish if required.
- **Custom Models and Tuning:** Hailo's Dataflow Compiler leaves room to adapt unsupported layers into compatible operations during compilation. You can experiment with ONNX conversion, or with replacing unsupported blocks by equivalent TensorFlow Lite operators.
4. Summary and Recommendation
- GeLU is supported, though still in preview, meaning it is available but may undergo further optimization.
- STFT is not currently supported natively on Hailo-8L. It is recommended to compute it off-chip (e.g., on the host CPU) as a preprocessing step.
- Your use of PyTorch is fully supported, along with other frameworks such as TensorFlow and TFLite.
Let me know if further clarification is needed!
Best Regards,
Omri