Hi all,
I converted a Phi-2 model to ONNX following the "Phi-2 tutorial | onnxruntime" guide. The ONNX file was generated and tested successfully.
However, when I tried to convert model.onnx to HAR, the Hailo parser failed on operators defined by ONNX Runtime, as shown below. Is this a known limitation, or did I do something wrong? Thanks!
[Note] The virtual environment is the 2025-04 release.
[info] Current Time: 15:14:28, 04/29/25
[info] CPU: Architecture: x86_64, Model: Intel(R) Core(TM) i7-8700 CPU @ 3.20GHz, Number Of Cores: 12, Utilization: 1.1%
[info] Memory: Total: 31GB, Available: 25GB
[info] System info: OS: Linux, Kernel: 6.8.0-57-generic
[info] Hailo DFC Version: 3.31.0
[info] HailoRT Version: 4.21.0
[info] PCIe: 0000:03:00.0: Number Of Lanes: 4, Speed: 8.0 GT/s PCIe
[info] PCIe: 0000:04:00.0: Number Of Lanes: 4, Speed: 8.0 GT/s PCIe
[info] Running `hailo parser onnx --har-path ./har --hw-arch hailo8l ./example-models/phi2-int4-cpu/model.onnx`
[info] Translation started on ONNX model model
[warning] Large model detected. The graph may contain either a large number of operators, or weight variables with a very large capacity.
[warning] Translation time may be a bit long, and some features may be disabled (e.g. model augmentation, retry simplified model, onnx runtime hailo model extraction, etc.).
Traceback (most recent call last):
  File "/local/workspace/hailo_virtualenv/bin/hailo", line 8, in <module>
    sys.exit(main())
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/tools/cmd_utils/main.py", line 111, in main
    ret_val = client_command_runner.run()
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_platform/tools/hailocli/main.py", line 64, in run
    return self._run(argv)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_platform/tools/hailocli/main.py", line 104, in _run
    return args.func(args)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/tools/parser_cli.py", line 213, in run
    self._parse(net_name, args, tensor_shapes)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/tools/parser_cli.py", line 297, in _parse
    self.runner.translate_onnx_model(
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 1187, in translate_onnx_model
    parser.translate_onnx_model(
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/parser/parser.py", line 194, in translate_onnx_model
    onnx.checker.check_model(model)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/onnx/checker.py", line 163, in check_model
    C.check_model_path(
onnx.onnx_cpp2py_export.checker.ValidationError: No Op registered for LayerNormalization with domain_version of 14
==> Context: Bad node spec for node. Name: /model/layers.0/input_layernorm/LayerNorm OpType: LayerNormalization