Hi everyone! I successfully converted my model from .onnx to .hef, and now I want to write some C++ inference code that will engage the Hailo chip. But the problem is: I don’t have a physical Hailo device in my possession, so I can’t just write inference code and test it.
I’ve read that the DFC has an Emulator inside, but from what I understood, this tool is available to Python developers only. If I am mistaken and there is some guidance on how to use the Hailo Emulator from C++ code, I would greatly appreciate it.
BTW, I have compiled the HailoRT libraries and found there is a flag HAILO_BUILD_EMULATOR, which I set to ON. But I am not sure what exactly changed… I was expecting that maybe some additional .so would appear, but the build folder looks exactly the same.
Anyway, guys, please let me know if there is a way to write C++ code and actually debug it without having a Hailo device in physical possession. Thank you in advance!
Hi @user59
At DeGirum we have an AI Hub that lets you run models on devices we host in the cloud, so you can write applications for Hailo without actually having a physical Hailo device with you. You can check our AI Hub at DeGirum AI Hub. However, our SDK is Python-based. Please see if you want to use it, and we can help you put your model in our hub so that you can run and test it.
Yes! You can use the Hailo Emulator for C++ inference, not just Python.
1. How It Works
The HAILO_BUILD_EMULATOR flag you enabled builds the emulator directly into the HailoRT library, allowing software-based inference without a physical device.
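Assuming a standard CMake build of HailoRT from source, the flag would be passed at configure time roughly like this (directory names and build type are illustrative):

```shell
# Fetch the HailoRT sources and configure with the emulator flag enabled
git clone https://github.com/hailo-ai/hailort.git
cd hailort
cmake -S . -B build -DCMAKE_BUILD_TYPE=Release -DHAILO_BUILD_EMULATOR=ON
cmake --build build --config Release
```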
2. C++ Implementation
Here’s how to use it in your C++ code:
#include "hailo/hailort.hpp"
#include <iostream>

int main() {
    // Create a virtual device (emulator).
    // The HailoRT C++ API reports errors via Expected<T> return values,
    // so each result is checked rather than wrapped in try/catch.
    auto vdevice = hailort::VDevice::create();
    if (!vdevice) {
        std::cerr << "Failed to create virtual device (emulator), status = " << vdevice.status() << std::endl;
        return -1;
    }

    // Load the HEF file
    auto hef = hailort::Hef::create("your_model.hef");
    if (!hef) {
        std::cerr << "Failed to load HEF, status = " << hef.status() << std::endl;
        return -1;
    }

    // Configure the virtual device with the HEF file
    auto network_groups = vdevice.value()->configure(hef.value());
    if (!network_groups) {
        std::cerr << "Failed to configure network groups, status = " << network_groups.status() << std::endl;
        return -1;
    }

    std::cout << "Successfully set up inference on Hailo Emulator!" << std::endl;
    return 0;
}
After that I ran hailortcli scan and here is the output:
Hailo devices not found
So unfortunately, no “Virtual device” appeared. I decided to run one of the examples just to reaffirm that I don’t have a Virtual device:
./cpp_async_infer_advanced_example
[HailoRT] [error] CHECK failed - Failed to create vdevice. there are not enough free devices. requested: 1, found: 0
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_OUT_OF_PHYSICAL_DEVICES(74)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_OUT_OF_PHYSICAL_DEVICES(74)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_OUT_OF_PHYSICAL_DEVICES(74)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_OUT_OF_PHYSICAL_DEVICES(74)
Failed to run inference. status=74, error message: Expected::expect() failed with status=74. Failed create vdevice
So I think it’s apparent that my HailoRT build does not have the emulator. Could you please point out what I am doing wrong?
Also a quick question regarding the HAILO_BUILD_EMULATOR flag. I checked what it does, and I don’t see any code that relates to spawning a “Virtual Device” on my machine. I see that it defines HAILO_EMULATOR, but that doesn’t seem to do much except add warning messages and adjust DEFAULT_TRANSFER_TIMEOUT. Is it possible that I am building something other than what I expect? Just for the record, I am compiling HailoRT using this page, paragraph “Compiling HailoRT from Sources”.