I developed a fire and smoke detection model using YOLOv8s. I have exported this model and currently possess it in ONNX format.
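For context, the ONNX export can be done with the Ultralytics CLI roughly like this (the weights file name and image size below are placeholders, not necessarily my exact settings):

```shell
# Re-export the trained YOLOv8s weights to ONNX.
# "best.pt" and imgsz=640 are placeholder values.
yolo export model=best.pt format=onnx opset=11 imgsz=640
```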
To compile this ONNX file into HEF, I installed the Hailo Docker on Ubuntu and attempted the compilation. The version information is as follows:
- OS: Ubuntu 22.04
- GPU Driver: 535
- nvcc version: 12.5
- cuDNN version: 9.10.2
In a previous topic I posted, we concluded that there was an issue with the GPU driver. Therefore, I performed a fresh install of Ubuntu, utilized the Hailo Docker, and ensured all software versions were installed exactly as per the guide.
Although the HEF compilation succeeded, the results were wrong: incorrect labels were displayed, or no bounding boxes were generated for the fire and smoke objects. I have re-exported the ONNX file so I can restart the process from the beginning.
I have two questions regarding this:
- Is there a specific guide I can refer to for converting to HEF? I have checked various user guides on hailo.ai, including the “Building Model - Model Compilation” section in the DFC v3.33.0 guide and other documents, but I find them difficult to understand.
- For compiling my custom model to HEF, is it better to use the Hailo Docker, or is a manual conversion setup preferred? I have a dataset of 877 images prepared; however, when I use the `hailomz compile` command, only 64 calibration images are used. I have run into this 64-image calibration limit before.
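For reference, my invocation looks roughly like this (the ONNX file name, calibration folder, target architecture, and class count below are illustrative; I am also not certain whether `--calib-path` is the right way to make the compiler use more than 64 images, which is part of my question):

```shell
# Compile the custom ONNX to HEF with the Hailo Model Zoo CLI.
# "fire_smoke.onnx", "./calib_images", and "--classes 2" are placeholders
# for my actual model, calibration folder, and label count.
hailomz compile yolov8s \
  --ckpt fire_smoke.onnx \
  --hw-arch hailo8 \
  --calib-path ./calib_images \
  --classes 2
```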
I would really appreciate a clear and definitive solution on how to successfully compile the HEF.