Project Goal:
We are building a workflow to:
1. Process numerical data extracted from an SQLite database.
2. Analyze the data using Machine Learning models optimized for the Hailo AI Accelerator.
3. Save the results back into the SQLite database.
Current Status:
We have successfully implemented and tested the pipeline using a TensorFlow-based dummy model for processing data in batches.
HailoRT 4.19.0 is installed and recognizes the Hailo device (Device('0000:01:00.0')).
Benchmarks show excellent performance with an existing HEF file (image-processing).
However, the current HEF model is image-based and incompatible with our numerical data workflow.
Challenges Faced:
1. Compiler Missing:
The installed HailoRT version (4.19.0) does not include the TensorFlow-to-HEF compiler.
Commands like the following are not available:
hailortcli compile-tensorflow
2. Hailo Model Zoo Installation Issue:
Attempting to install the Hailo Model Zoo 2.13.0 (Python package) in our Python 3.11 environment fails with:
fatal error: longintrepr.h: No such file or directory
The lap dependency seems incompatible with Python 3.11.
3. Workflow Model Compatibility:
Current VStreams in the example HEF are image-based (NHWC 416x416x3) and unsuitable for tabular input like our numerical features (3 values per row).
Unsure how to adapt the pipeline for numerical data processing.
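For illustration, the shape mismatch looks roughly like this (the batch size and the reshape target below are placeholders, since we do not yet have a HEF compiled for tabular input):

import numpy as np

# What the example HEF expects: NHWC image tensors
image_input = np.zeros((1, 416, 416, 3), dtype=np.float32)  # (batch, height, width, channels)

# What our data actually looks like: 3 numerical features per row
tabular_batch = np.random.rand(32, 3).astype(np.float32)    # (batch, features)

# A model compiled for tabular data would need an input layout such as (1, 1, 3),
# i.e. whatever the compiled HEF declares, rather than (416, 416, 3).
tabular_as_nhwc = tabular_batch.reshape(32, 1, 1, 3)
print(image_input.shape, tabular_batch.shape, tabular_as_nhwc.shape)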
Questions for the Community:
Is there an example or template for integrating numerical data models (e.g., DNNs) with the Hailo AI Accelerator?
Should we downgrade to Python 3.9 for compatibility with the Model Zoo, or is there an alternative?
Are there any recommended workflows or compiler tools for non-image-based processing?
Can TAPPAS be used to streamline this kind of numerical data pipeline?
Additional Info:
Raspberry Pi 5 running 64-bit Raspberry Pi OS.
TensorFlow 2.18.0 installed and working in a virtual environment.
SQLite database for input and output, with batch processing already validated.
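For context, a simplified sketch of the kind of batch read we do from SQLite (the table and column names here are placeholders):

import sqlite3
import numpy as np

def read_feature_batches(db_path="input.db", batch_size=32):
    # Yield (batch_size, 3) float32 arrays from a hypothetical 'features' table.
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()
    cursor.execute("SELECT f1, f2, f3 FROM features")  # placeholder column names
    while True:
        rows = cursor.fetchmany(batch_size)
        if not rows:
            break
        yield np.asarray(rows, dtype=np.float32)
    conn.close()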
Any advice, examples, or insights would be greatly appreciated!
omria
January 12, 2025, 1:57pm
Hey @walter.richtscheid,
Welcome to the Hailo Community!
Let me provide a comprehensive solution for processing numerical data with the Hailo AI Accelerator:
First, download the Dataflow Compiler (DFC) from our developer zone:
https://hailo.ai/developer-zone/software-downloads/
(You’ll need to register/login to access the download)
For Model Compilation (on an x86 machine with a GPU):
Convert your TensorFlow model to ONNX format:
python -m tf2onnx.convert --saved-model your_model_dir --output model.onnx
Then parse, optimize (against a calibration set of representative numerical samples), and compile the model with the Dataflow Compiler:
hailo parser model.onnx --parsing-report-path parser.log
hailo optimize model.har --calib-path calib_data/
hailo compile model.har --hef model.hef
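If it helps, here is a minimal sketch of the kind of small dense network you could export as a SavedModel for the tf2onnx step above (the layer sizes, names, and output directory are placeholders, not a recommendation):

import tensorflow as tf

# Placeholder architecture: a small dense network taking 3 numerical features per sample.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,), name="features"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, name="prediction"),
])
model.compile(optimizer="adam", loss="mse")

# With TF 2.16+ (Keras 3), export() writes a SavedModel directory that
# tf2onnx can consume via --saved-model your_model_dir.
model.export("your_model_dir")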
Here’s a complete pipeline implementation for your numerical data processing:
import numpy as np
import sqlite3
from hailo_platform import (HEF, VDevice, ConfigureParams, HailoStreamInterface,
                            InferVStreams, InputVStreamParams, OutputVStreamParams,
                            FormatType)


class NumericalDataCallback:
    """Holds user state for the pipeline and persists results to SQLite."""

    def __init__(self):
        self.db_path = "results.db"

    def save_to_db(self, data):
        conn = sqlite3.connect(self.db_path)
        cursor = conn.cursor()
        cursor.execute('CREATE TABLE IF NOT EXISTS results (id INTEGER PRIMARY KEY, value REAL)')
        cursor.executemany('INSERT INTO results (value) VALUES (?)',
                           [(float(value),) for value in np.asarray(data).flatten()])
        conn.commit()
        conn.close()


class NumericalDataApp:
    def __init__(self, user_data, hef_path="model.hef"):
        self.user_data = user_data
        self.hef_path = hef_path
        # Keep the VDevice alive for the lifetime of the app (a short-lived
        # "with" block here would release the device before inference runs).
        self.vdevice = VDevice()
        self.hef = HEF(self.hef_path)
        configure_params = ConfigureParams.create_from_hef(self.hef, interface=HailoStreamInterface.PCIe)
        # configure() returns a list of network groups; we use the first one.
        self.network_group = self.vdevice.configure(self.hef, configure_params)[0]
        self.network_group_params = self.network_group.create_params()

    def run_inference(self, data):
        # Standard HailoRT Python streaming-inference pattern: create input/output
        # vstreams around the configured network group and run a blocking infer().
        input_params = InputVStreamParams.make(self.network_group, format_type=FormatType.FLOAT32)
        output_params = OutputVStreamParams.make(self.network_group, format_type=FormatType.FLOAT32)
        input_name = self.hef.get_input_vstream_infos()[0].name
        with InferVStreams(self.network_group, input_params, output_params) as infer_pipeline:
            with self.network_group.activate(self.network_group_params):
                results = infer_pipeline.infer({input_name: data})
        # Return the first (and only) output tensor.
        return next(iter(results.values()))

    def run(self):
        # Replace with your actual data, shaped to match the HEF's input vstream.
        input_data = np.random.rand(1, 3).astype(np.float32)
        results = self.run_inference(input_data)
        self.user_data.save_to_db(results)
        print(f"Pipeline run complete. Results saved: {results}")


if __name__ == "__main__":
    user_data = NumericalDataCallback()
    app = NumericalDataApp(user_data)
    app.run()
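As a sanity check before wiring in real data, you can print the compiled HEF's vstream shapes so the arrays you pass to run_inference match what the model was compiled for; read_batches() in the commented loop is only a placeholder for however you pull (N, 3) float32 arrays out of SQLite:

from hailo_platform import HEF

hef = HEF("model.hef")
for info in hef.get_input_vstream_infos():
    print("input:", info.name, info.shape)    # must match the arrays you feed in
for info in hef.get_output_vstream_infos():
    print("output:", info.name, info.shape)

# Hypothetical batch loop tying everything together:
# for batch in read_batches():                # placeholder SQLite reader
#     user_data.save_to_db(app.run_inference(batch))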
For reference implementations, check our hailo-ai/hailo-rpi5-examples repository on GitHub, which can be adapted for numerical data processing.
Let me know if you need any clarification! Can’t wait to see the project working!
Best Regards,
Omria
Hi @omria,
Thank you so much for your detailed response and the clear step-by-step guidance for processing numerical data with the Hailo AI Accelerator. We truly appreciate the time and effort you put into sharing this comprehensive solution!
We’re currently organizing our workflow and preparing to implement the suggested steps, including the Dataflow Compiler and the Python-based pipeline. Once we’ve tested everything, we’ll make sure to share our results and any additional insights with the community.
Thanks again for your support and expertise – it’s invaluable as we continue to build this project.
Best regards,
Walter