Hi. I’m stuck, not really sure where to go next.
With TensorFlow I'm doing this:
import os
import cv2
import numpy as np
import tensorflow as tf
import matplotlib.pyplot as plt
from tensorflow.keras.models import load_model

new_model = load_model(os.path.join('imageclassifier.keras'))
img = cv2.imread('Adailer.C.png')          # cv2.imread returns BGR
resize = tf.image.resize(img, (256, 256))  # resize to the model's input size
plt.imshow(img)
plt.show()

# normalise to [0, 1], add a batch dimension, and predict
yhat = new_model.predict(np.expand_dims(resize / 255, 0))

# class_names is the list of labels from training
print(
    "This image most likely belongs to {} with a {:.2f} percent confidence."
    .format(class_names[np.argmax(yhat)], 100 * np.max(yhat))
)
I now want to do the same thing on the same Raspberry Pi, but running the model on the Hailo-8L.
I've set up the Hailo Docker environment on an Ubuntu server and created a HAR file.
But after that, I have no idea what I'm meant to be doing.
What are the next steps, please?
Convert HAR to HEF: You can convert your HAR file to a HEF by running hailo optimize (quantization, which needs a small calibration set) followed by hailo compile. For detailed guidance, refer to the DFC guide in the Hailo Developer Zone. For a simplified walkthrough, you can check this post: Creating Custom HEF Using DFC Model Zoo.
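If you prefer to stay in Python, the sketch below shows roughly the same flow through the DFC's hailo_sdk_client API, run inside the DFC Docker on the Ubuntu server. It is only a sketch based on the DFC tutorial notebooks: the file names and the calibration array are placeholders, and method names can differ between DFC versions, so check the tutorials shipped with your release.

import numpy as np
from hailo_sdk_client import ClientRunner  # DFC Python API, available inside the DFC Docker

# Load the HAR produced by the parsing step
runner = ClientRunner(har='imageclassifier.har')

# Quantize with a calibration set of real, preprocessed images
# (placeholder file: a NumPy array shaped like the model input, e.g. Nx256x256x3)
calib_data = np.load('calib_set.npy')
runner.optimize(calib_data)

# Compile the optimized model to a HEF and copy it to the Raspberry Pi
hef = runner.compile()
with open('imageclassifier.hef', 'wb') as f:
    f.write(hef)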
Run Inference: You can run inference using one of the following methods:
HailoRT CLI: Use the command: hailortcli run --hef your_model.hef --input your_input_image --output your_output_dir
HailoRT Python API: This allows you to configure the device and run inference programmatically; a rough sketch follows below.
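As an illustration only, here is a sketch of the Python API route on the Raspberry Pi, mirroring your TensorFlow snippet. It follows the usual hailo_platform inference pattern from Hailo's Python examples; the HEF name, input size, preprocessing, and class_names are assumptions carried over from your code, and class/method names can vary between HailoRT versions, so compare against the examples bundled with your HailoRT install.

import cv2
import numpy as np
from hailo_platform import (HEF, VDevice, HailoStreamInterface, ConfigureParams,
                            InputVStreamParams, OutputVStreamParams, InferVStreams, FormatType)

class_names = ['class_0', 'class_1']  # placeholder: your training labels, in order
hef = HEF('imageclassifier.hef')

with VDevice() as target:
    # Configure the Hailo-8L with the compiled network
    configure_params = ConfigureParams.create_from_hef(hef, interface=HailoStreamInterface.PCIe)
    network_group = target.configure(hef, configure_params)[0]
    network_group_params = network_group.create_params()

    input_info = hef.get_input_vstream_infos()[0]
    output_info = hef.get_output_vstream_infos()[0]
    input_params = InputVStreamParams.make(network_group, format_type=FormatType.FLOAT32)
    output_params = OutputVStreamParams.make(network_group, format_type=FormatType.FLOAT32)

    # Same preprocessing as the TensorFlow version; adjust if normalisation
    # was baked into the model during optimization
    img = cv2.imread('Adailer.C.png')
    resized = cv2.resize(img, (256, 256)).astype(np.float32) / 255.0
    input_data = {input_info.name: np.expand_dims(resized, 0)}

    with InferVStreams(network_group, input_params, output_params) as infer_pipeline:
        with network_group.activate(network_group_params):
            results = infer_pipeline.infer(input_data)

    yhat = results[output_info.name]
    print("This image most likely belongs to {} with a {:.2f} percent confidence."
          .format(class_names[np.argmax(yhat)], 100 * np.max(yhat)))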
For more information on using this in an application, check out these examples: