Hello,
I have a simple question.
Is the HailoRT monitor not supported on Windows?
Hey @Yu_Eunsang,
Welcome to the Hailo Community!
You’re correct: the hailortcli monitor command is only supported on Linux.
@omria, thank you for your quick response.
Then, is there any other tool to monitor Hailo on Windows?
I want to see the NPU usage during an inference test.
Should I get it through the Hailo API?
Best regards
Hey @Yu_Eunsang,
Yep, I’d definitely recommend going this route for monitoring NPU utilization during your inference runs.
Both our C/C++ and Python APIs have built-in runtime statistics that you can poll directly from your application. Here’s a quick Python example of how I’ve been doing it:
from hailo_platform.pyhailort import VDeviceParams, VDevice, InferModelParams, ConfiguredInferModel

# After you've loaded your HEF: create a VDevice with the scheduler enabled
# and configure your model on it
device = VDevice(VDeviceParams(device_ids=[0], scheduler=True))
model = ConfiguredInferModel(device, InferModelParams(hef_path="your_model.hef"))

# Run your inferences ('inputs' is your preprocessed input data)
outputs = model.run(inputs)

# Then grab the runtime statistics
stats = device.get_statistics()
print(f"Device Utilization: {stats.device_utilization}%")
print(f"Model Utilization: {stats.model_utilization}%")
If you’re working with the C API instead, there are equivalent functions in the runtime statistics section, such as hailo_vdevice_get_runtime_statistics(), which will give you the same metrics.
For more detailed info, check out the HailoRT User Guide: the Runtime Statistics section, starting around page 229 for the C++ API and pages 365-367 for the Python API.