[info] Model Optimization Algorithm Quantization-Aware Fine-Tuning is done (completion time is 00:00:25.47)
[info] Layer Noise Analysis skipped
[info] Model Optimization is done
[info] Saved HAR to: /local/workspace/direction_classifier.har
[info] Loading model script commands to direction_classifier from /local/workspace/hailo_model_zoo/hailo_model_zoo/cfg/alls/generic/direction_classifier.alls
[info] ParsedPerformanceParam command, setting optimization_level(max=2)
[info] Loading network parameters
[info] Starting Hailo allocation and compilation flow
[error] Mapping Failed (allocation time: 0s)
Can’t find mutual format for fc1_d1 → ew_add0_ew_add_n_fc1_d0
[error] Failed to produce compiled graph
[error] BackendAllocatorException: Compilation failed: Can’t find mutual format for fc1_d1 → ew_add0_ew_add_n_fc1_d0
(hailo_virtualenv) hailo@116af785a8fc:/local/workspace$
Can't find mutual format for fc1_d1 → ew_add0_ew_add_n_fc1_d0
This indicates our Dataflow Compiler’s allocator couldn’t resolve the memory layout requirements between your fc1_d1 layer and the downstream element-wise add node. Our hardware runs most operators in its native NHCW format, and when connected operators have conflicting layout expectations the compiler has to insert explicit format conversions; when it can’t, you get this “mutual format” failure.
Our compiler’s default behavior is to insert format conversions only at input layers (through input_conversion), not between internal graph nodes. To resolve this, you’ll want to use a model script to explicitly force a conversion/transpose immediately after fc1_d1, so that its output layout aligns with the other input to your element-wise add operation.
I can help you set up the model-script syntax if you need it. Are you working with a custom model architecture or a standard framework export?
That’s just your local file path - I can’t see what’s actually in your model from that. Could you share the .alls file contents and maybe a screenshot from Netron or something similar so I can see what we’re working with?
fc1_d0: Shape (1, 128) - flat/linear tensor format
fc1_d1: Shape (1, 16, 8) - 3D tensor format (likely NHCW or NHWC)
You’re trying to do an element-wise addition between these two, but they have completely different shapes and memory layouts. The Hailo compiler can’t work out how to convert between them on its own for internal nodes; it only inserts such conversions automatically at model inputs.
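To make the mismatch concrete, here is a quick NumPy check (the shapes come from your graph; the values are just placeholders):

import numpy as np

fc1_d0 = np.zeros((1, 128), dtype=np.float32)    # flat branch
fc1_d1 = np.zeros((1, 16, 8), dtype=np.float32)  # 3D branch

# fc1_d0 + fc1_d1 raises "operands could not be broadcast together":
# the trailing dimensions (128 vs. 8) are incompatible.

# After a transpose and flatten, the element-wise add is well defined:
fc1_d1_flat = fc1_d1.transpose(0, 2, 1).reshape(1, 128)
print((fc1_d0 + fc1_d1_flat).shape)  # (1, 128)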
How To Fix
You need to transpose and reshape fc1_d1 to match fc1_d0’s flat format before the addition.
Updated Model Script
Here’s your fixed .alls script with the necessary changes:
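Something along these lines, as a minimal sketch (the reshape command name and its arguments in particular are my best guess, so please double-check both lines against the model script reference for your Dataflow Compiler version):

transpose(fc1_d1, perm=[0, 2, 1], name="fc1_d1_transposed")
reshape(fc1_d1_transposed, shape=[1, 128], name="fc1_d1_reshaped")

The transpose takes fc1_d1 from (1, 16, 8) to (1, 8, 16), and the reshape then flattens it into the (1, 128) shape that fc1_d0 already has.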
Important: In your model graph, make sure the element-wise Add operation connects to:
fc1_d0 (unchanged)
fc1_d1_reshaped (NOT the original fc1_d1)
So it should look like: Add(fc1_d0, fc1_d1_reshaped)
Why This Works
Transpose first: This handles any row/column memory order issues between the layouts
Reshape second: This flattens the 3D tensor (1, 16, 8) into the required (1, 128) shape
Compiler restriction: Hailo only inserts automatic format conversions at model entry points, not between internal nodes - that’s why we need to do this explicitly
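If it helps to see why the transpose has to happen before the reshape, here is a small NumPy illustration (only an analogy for the layout handling, not what the compiler does internally):

import numpy as np

x = np.arange(128).reshape(1, 16, 8)  # stands in for fc1_d1

flat_only = x.reshape(1, 128)                            # reshape without transposing
flat_transposed = x.transpose(0, 2, 1).reshape(1, 128)   # transpose, then reshape

print(flat_only[0, :4])        # [0 1 2 3]
print(flat_transposed[0, :4])  # [ 0  8 16 24]

Both results have shape (1, 128), but the element order differs; the transpose is what puts the flattened values into the order the other branch expects.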
ocal/shared_with_docker/direction_classifier.yaml --hw-arch hailo8l
Start run for network direction_classifier …
Initializing the hailo8l runner…
[info] Translation started on Tensorflow model direction_classifier
[info] Start nodes mapped from original model: 'serving_default_rescaling_input:0': 'direction_classifier/input_layer1'.
[info] End nodes mapped from original model: ‘StatefulPartitionedCall:0’.
[info] Translation completed on Tensorflow model direction_classifier (completion time: 00:00:00.13)
[info] Saved HAR to: /local/workspace/direction_classifier.har
Preparing calibration data…
[info] Loading model script commands to direction_classifier from /local/shared_with_docker/direction_classifier.alls
Traceback (most recent call last):
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/script_parser/model_script_parser.py", line 381, in parse_script
    script_grammar.parseString(input_script, parseAll=True)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/pyparsing.py", line 1955, in parseString
    raise exc
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/pyparsing.py", line 3814, in parseImpl
    raise ParseException(instring, loc, self.errmsg, self)
pyparsing.ParseException: Expected end of text, found '#' (at char 185), (line:4, col:56)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/local/workspace/hailo_virtualenv/bin/hailomz", line 33, in <module>
    sys.exit(load_entry_point('hailo-model-zoo', 'console_scripts', 'hailomz')())
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 122, in main
    run(args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main.py", line 111, in run
    return handlers[args.command](args)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 248, in compile
    _ensure_optimized(runner, logger, args, network_info)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/main_driver.py", line 91, in _ensure_optimized
    optimize_model(
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 351, in optimize_model
    optimize_full_precision_model(runner, calib_feed_callback, logger, model_script, resize, input_conversion, classes)
  File "/local/workspace/hailo_model_zoo/hailo_model_zoo/core/main_utils.py", line 315, in optimize_full_precision_model
    runner.load_model_script(model_script)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 497, in load_model_script
    self._sdk_backend.load_model_script_from_file(model_script, append)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 494, in load_model_script_from_file
    self._script_parser.parse_script_from_file(model_script_path, nms_config, append)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/script_parser/model_script_parser.py", line 312, in parse_script_from_file
    return self.parse_script(f.read(), append, nms_config_file)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/script_parser/model_script_parser.py", line 389, in parse_script
    raise BackendScriptParserException(f"Parsing failed at:\n{e.markInputline()}")
hailo_sdk_client.sdk_backend.sdk_backend_exceptions.BackendScriptParserException: Parsing failed at:
transpose(fc1_d1,perm=[0,2,1],name="fc1_d1_transposed")>!<#(1,16,8)→(1,8,16)
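Note: the >!< marker in the last line shows where the script parser gave up, right at the inline # comment. One thing worth trying (just a guess based on that marker) is the same transpose line with the trailing comment removed:

transpose(fc1_d1,perm=[0,2,1],name="fc1_d1_transposed")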