Error while converting a file from ONNX to HAR or HEF

Hey everyone! I’ve been trying to convert this ONNX file to HEF or HAR, but whenever I try, I keep getting an error saying that the Hailo packages are not there. When I check, packages like HailoRT, the compiler, etc. are installed, but the command hailo_convert doesn’t exist. I checked ChatGPT and other online forums but couldn’t resolve this issue. Would really appreciate it if any of you could help with this, thanks!

File and file paths:

  • Python source code, compiler (whl file), HailoRT (whl file), AI Software Suite (run file): all are in Home/Downloads
  • HailoRT (deb file), hailo_venv (virtual environment), and the TFLite and ONNX files obtained from the Python code: all are in Home/Downloads/software_ai_sw_suite
  • COMPRESSED (folder containing the test, train, and valid folders with the XML and JPG image data): all in Home/Trial

Hey @Shannen_Milton ,

Welcome to the Hailo Community!

I noticed there might be some confusion with the command you’re using. For DFC operations, you should use the hailo command rather than hailo_convert in the CLI. If you’re trying to run this from a Python script, could you share it so I can provide more specific guidance? Also, please ensure you’re working within the virtual environment where all the necessary packages are installed.
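To make the CLI suggestion concrete, here is a sketch of the usual three-step DFC flow (parse, optimize, compile). The flags, output file names, and the `hailo8` architecture value are from memory of the DFC docs and depend on your DFC version and target device, so please verify with `hailo --help`:

```shell
# Work inside the venv where the Dataflow Compiler is installed
# (path assumed from your setup above)
source ~/Downloads/hailo_venv/bin/activate

# 1. Parse: translate the ONNX model into a Hailo archive (.har)
hailo parser onnx model.onnx --hw-arch hailo8

# 2. Optimize: quantize the model (a real calibration set gives
#    better accuracy than random calibration data)
hailo optimize model.har --use-random-calib-set

# 3. Compile: produce the deployable .hef binary
hailo compiler model_optimized.har
```

Each step writes a new artifact, so if one stage fails you can inspect the intermediate `.har` before moving on.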

Thanks for replying! I’m not sure why, but the compiler keeps giving me the same error even now. And yes, I’m using a Python script (I managed to train the model, but I’m not able to convert it to HEF). I’ll add it below.

File path:
/home/ubuntu/optpascaltotflite_part2.py

import os
import datetime

from tflite_model_maker.config import ExportFormat
from tflite_model_maker import model_spec
from tflite_model_maker import object_detector

import tensorflow as tf
assert tf.__version__.startswith('2')

tf.get_logger().setLevel('ERROR')
from absl import logging
logging.set_verbosity(logging.ERROR)

spec = model_spec.get('efficientdet_lite0')

# Specify your data directories (updated paths)
train_image_dir = '/home/ubuntu/Downloads/COMPRESSED/train'
train_annotations_dir = '/home/ubuntu/Downloads/COMPRESSED/train'
validation_image_dir = '/home/ubuntu/Downloads/COMPRESSED/valid'
validation_annotations_dir = '/home/ubuntu/Downloads/COMPRESSED/valid'
test_image_dir = '/home/ubuntu/Downloads/COMPRESSED/test'
test_annotations_dir = '/home/ubuntu/Downloads/COMPRESSED/test'

# Load data using from_pascal_voc method
train_data = object_detector.DataLoader.from_pascal_voc(
    train_image_dir, train_annotations_dir, label_map={1: "person"}
)
validation_data = object_detector.DataLoader.from_pascal_voc(
    validation_image_dir, validation_annotations_dir, label_map={1: "person"}
)
test_data = object_detector.DataLoader.from_pascal_voc(
    test_image_dir, test_annotations_dir, label_map={1: "person"}
)

# Create and train the object detection model
print("**********************Training Model*********************************")
print("Timestamp", datetime.datetime.now())

model = object_detector.create(
    train_data,
    model_spec=spec,
    epochs=150,
    batch_size=10,
    train_whole_model=True,
    validation_data=validation_data
)

# Evaluate the model
print("**********************Evaluating Model*********************************")
print("Timestamp", datetime.datetime.now())

model.evaluate(validation_data)

# Export the trained model to TFLite format
print("**********************Exporting to TFLite*********************************")
print("Timestamp", datetime.datetime.now())

modelname = 'detect_Rasp5_fishery_906img.tflite'
model.export(export_dir='.', tflite_filename=modelname)

# Evaluate the exported TFLite model on the test set
print("**********************Evaluating TFLite Model on Test Set*********************************")
print("Timestamp", datetime.datetime.now())

model.evaluate_tflite(modelname, test_data)
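Since the script above ends at a `.tflite` export, the remaining conversion to HEF would go through the DFC rather than a `hailo_convert` command. Below is a minimal sketch using the DFC Python API; the names `ClientRunner`, `translate_tf_model`, `optimize`, and `compile` come from the Hailo Dataflow Compiler package, but the exact signatures may differ by version, and `calib_data` is a placeholder for a calibration dataset you would need to supply. Run it inside hailo_venv and check it against your installed DFC docs:

```python
# Sketch only: requires the Hailo Dataflow Compiler installed in the venv.
from hailo_sdk_client import ClientRunner

runner = ClientRunner(hw_arch='hailo8')  # adjust to your target device

# Parse the exported TFLite model into Hailo's internal representation
runner.translate_tf_model('detect_Rasp5_fishery_906img.tflite',
                          'efficientdet_lite0')
runner.save_har('model.har')   # save the parsed model as a .har archive

# Quantize; calib_data is a hypothetical numpy calibration set you provide
runner.optimize(calib_data)

# Compile to a deployable HEF binary
hef = runner.compile()
with open('model.hef', 'wb') as f:
    f.write(hef)
```

If the import itself fails, that usually means the DFC wheel is not installed in the active virtual environment, which would also explain the "packages are not there" error from the CLI.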