ParsingWithRecommendationException: Parsing failed. The errors found in the graph are:
UnsupportedShuffleLayerError in op /model.105/Reshape: Failed to determine type of layer to create in node /model.105/Reshape
UnsupportedShuffleLayerError in op /model.105/Transpose: Failed to determine type of layer to create in node /model.105/Transpose
UnsupportedFeatureSplitterLayerError in op /model.105/Split: Feature splitter vertex /model.105/Split is splitting input over unsupported axis 4
UnsupportedShuffleLayerError in op /model.105/Reshape_2: Failed to determine type of layer to create in node /model.105/Reshape_2
UnsupportedShuffleLayerError in op /model.105/Reshape_3: Failed to determine type of layer to create in node /model.105/Reshape_3
UnsupportedShuffleLayerError in op /model.105/Transpose_1: Failed to determine type of layer to create in node /model.105/Transpose_1
UnsupportedFeatureSplitterLayerError in op /model.105/Split_1: Feature splitter vertex /model.105/Split_1 is splitting input over unsupported axis 4
UnsupportedShuffleLayerError in op /model.105/Reshape_5: Failed to determine type of layer to create in node /model.105/Reshape_5
UnsupportedShuffleLayerError in op /model.105/Reshape_6: Failed to determine type of layer to create in node /model.105/Reshape_6
UnsupportedShuffleLayerError in op /model.105/Transpose_2: Failed to determine type of layer to create in node /model.105/Transpose_2
UnsupportedFeatureSplitterLayerError in op /model.105/Split_2: Feature splitter vertex /model.105/Split_2 is splitting input over unsupported axis 4
UnsupportedShuffleLayerError in op /model.105/Reshape_8: Failed to determine type of layer to create in node /model.105/Reshape_8
Please try to parse the model again, using these end node names: /model.105/m.0/Conv, /model.105/m.2/Conv, /model.105/m.1/Conv
I set the architecture as “hailo8l” and use the parser to convert from ONNX to .har, yet I get a similar error:
ParsingWithRecommendationException: Parsing failed. The errors found in the graph are:
UnsupportedShuffleLayerError in op Reshape_303: Failed to determine type of layer to create in node Reshape_303
UnsupportedShuffleLayerError in op Transpose_304: Failed to determine type of layer to create in node Transpose_304
Please try to parse the model again, using these end node names: Conv_302
Please have a look at the last line of the error message. You will need to use the end node names suggested in the error message, not “output”.
You will also need to ensure you capture all outputs of the model, not just Conv_302.
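The suggested names can be pulled straight out of the exception text. A small helper along these lines can do it (this is a hypothetical convenience function, not part of the Hailo tooling; the message format is taken from the errors above):

```python
import re

def recommended_end_nodes(error_text: str) -> list[str]:
    """Extract the end node names suggested in a
    ParsingWithRecommendationException message."""
    match = re.search(r"end node names:\s*(.+)", error_text)
    if not match:
        return []
    return [name.strip() for name in match.group(1).split(",")]

message = (
    "Please try to parse the model again, using these end node names: "
    "/model.105/m.0/Conv, /model.105/m.2/Conv, /model.105/m.1/Conv"
)
print(recommended_end_nodes(message))
# ['/model.105/m.0/Conv', '/model.105/m.2/Conv', '/model.105/m.1/Conv']
```

The resulting list is exactly what you then supply to the parser as the end node names, making sure all three detection heads are included.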
Here are a few more pointers to get a better understanding.
Tutorial
Please go through the tutorial on how to convert a model. Inside the Hailo AI Software Suite docker, call:
hailo tutorial
Note the start and end node names; they are important. Some models contain layers at the beginning and/or end of the model that need to be excluded during the parsing step and implemented on the host.
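To make “implemented on the host” concrete: if, say, a trailing sigmoid activation is cut from the graph at the parsing step, the host applies it to the raw device output itself. A minimal pure-Python sketch (the values are illustrative, not output from any real model):

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid, as used by detection heads to turn logits into scores."""
    return 1.0 / (1.0 + math.exp(-x))

# Raw logits as they would come out of a truncated end node
# (made-up values for illustration).
raw_logits = [-2.0, 0.0, 3.5]

# The excluded activation layer, reproduced on the host CPU.
scores = [sigmoid(v) for v in raw_logits]
print([round(s, 3) for s in scores])
# [0.119, 0.5, 0.971]
```

The same idea applies to any other excluded head logic: the device delivers the tensor at the chosen end node, and the host finishes the math.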
Model Zoo CLI
Have a look at the model in the Hailo Model Zoo. First, use the Model Zoo CLI. Inside the Hailo AI Software Suite docker, run:
cd /local/shared_with_docker/
hailomz parse yolov7
This will download the original ONNX file and parse it into a HAR file. You can find the original in the /local/shared_with_docker/.hailomz/model_files/ObjectDetection/ directory.
Now open both the ONNX file and the HAR file in Netron.
You will see that the Reshape and Transpose layers are part of the ONNX but not included in the HAR file.
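Those excluded Reshape and Transpose steps then become the host's job. A toy sketch of what that looks like in pure Python (the 2×3 dimensions are made up for illustration; a real YOLO head would use the actual grid, anchor, and channel sizes):

```python
def reshape_2d(flat, rows, cols):
    """Reshape a flat list into a rows x cols nested list (like ONNX Reshape)."""
    assert len(flat) == rows * cols
    return [flat[r * cols:(r + 1) * cols] for r in range(rows)]

def transpose(matrix):
    """Swap the two axes of a nested list (like ONNX Transpose)."""
    return [list(col) for col in zip(*matrix)]

flat_output = list(range(6))          # pretend raw device output
grid = reshape_2d(flat_output, 2, 3)  # -> [[0, 1, 2], [3, 4, 5]]
print(transpose(grid))                # -> [[0, 3], [1, 4], [2, 5]]
```

In practice you would do this with NumPy on the real output tensors, but the point is the same: the layout operations dropped from the HAR are cheap to reproduce on the host.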
Model Zoo YAML
The Model Zoo uses YAML files to store the information needed to parse a model. This can be a useful hint on how to parse a model. Have a look at the file for yolov7 and compare it with the ONNX graph in Netron.
You will see the end node names. Note that the end node names can be slightly different when you use a variant from a different source. You can use Netron to determine the names for your specific model.
Regarding the original model we tried: I understand the error message, but the issue is that there are many more layers in the .onnx model following the /model.105/m.0/Conv layer (see image).
The Netron graph can be a bit misleading. The number of layers is not necessarily a reflection of the computation needed. Some boxes require a large amount of computation, while in other cases many boxes together require only very few operations.
AI frameworks were developed for CPUs and GPUs, not for AI accelerators alone. They can describe operators that you would not want to execute on the Hailo device. These are mostly at the beginning and end of a network; we call this pre- and post-processing. They need to be implemented and executed on the host CPU.
We do provide some of the post-processing code for popular networks in Tappas Postprocessing and our Application Code Runtime Examples. In some cases we even run that code under HailoRT; for example, the NMS for YOLO can be part of the HEF that runs on the host. Have a look at the model script for YOLOv7 — it contains an instruction to add the NMS.
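For a sense of what that post-processing step actually computes, here is a minimal greedy IoU-based NMS sketch in pure Python (this is the textbook algorithm, not the Tappas or HailoRT implementation; box values are illustrative):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression; returns indices of kept boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        # Keep a box only if it does not heavily overlap an already-kept box.
        if all(iou(boxes[i], boxes[j]) < iou_threshold for j in keep):
            keep.append(i)
    return keep

boxes = [(0, 0, 10, 10), (1, 1, 11, 11), (50, 50, 60, 60)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # [0, 2] — the overlapping second box is suppressed
```

Whether this runs as your own host code or via the NMS instruction in the model script, the computation is the same; the model script route just saves you from writing it yourself.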
I understand that the last layers are post-processing and not heavy convolutions. The context is that we are looking at moving from the Google Coral platform to Hailo for our devices. We run three models in a pipeline on a Raspberry Pi and want to compare overall performance.
We are also open to retraining with a new object detector model. Alternatively, could you link us to an example that runs a custom model and adds the post-processing nodes afterwards?