Optimization Problem

I am writing a script to optimize my model, and an error was raised:
(hailodfc) (base) ykl@ykl-virtual-machine:~/hailodfc$ /home/ykl/hailodfc/bin/python /home/ykl/hailodfc/paser4.py
[info] ParsedPerformanceParam command, setting optimization_level(max=2)
[info] Loading model script commands to Aod-Net from string
[info] ParsedPerformanceParam command, setting optimization_level(max=2)
Calibration data shape: (1449, 450, 600, 3), range: 0.0 - 255.0
[info] Starting Model Optimization
[warning] Reducing optimization level to 0 (the accuracy won’t be optimized and compression won’t be used) because there’s no available GPU
[warning] Running model optimization with zero level of optimization is not recommended for production use and might lead to suboptimal accuracy results
[info] Model received quantization params from the hn
[info] MatmulDecompose skipped
[info] Starting Mixed Precision
[info] Model Optimization Algorithm Mixed Precision is done (completion time is 00:00:00.08)
[info] Remove layer Aod-Net/conv2 because it has no effect on the network output
[info] Remove layer Aod-Net/concat2 because it has no effect on the network output
[info] Remove layer Aod-Net/conv4 because it has no effect on the network output
Traceback (most recent call last):
  File "/home/ykl/hailodfc/paser4.py", line 42, in <module>
    runner.optimize(calib_data_np)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 2128, in optimize
    self._optimize(calib_data, data_type=data_type, work_dir=work_dir)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 1970, in _optimize
    self._sdk_backend.full_quantization(calib_data, data_type=data_type, work_dir=work_dir)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1125, in full_quantization
    self._full_acceleras_run(self.calibration_data, data_type)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1319, in _full_acceleras_run
    optimization_flow.run()
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/tools/orchestator.py", line 306, in wrapper
    return func(self, *args, **kwargs)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 335, in run
    step_func()
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
    result = method(*args, **kwargs)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/tools/subprocess_wrapper.py", line 124, in parent_wrapper
    func(self, *args, **kwargs)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 351, in step1
    self.pre_quantization_structural()
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
    result = method(*args, **kwargs)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 384, in pre_quantization_structural
    self._remove_dead_layers()
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
    result = method(*args, **kwargs)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 512, in _remove_dead_layers
    algo.run()
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/algorithms/optimization_algorithm.py", line 54, in run
    return super().run()
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/algorithms/algorithm_base.py", line 150, in run
    self._run_int()
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/algorithms/dead_layers_removal.py", line 79, in _run_int
    new_input, new_output = self._infer_model_random_data(ref_input)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/algorithms/dead_layers_removal.py", line 115, in _infer_model_random_data
    output = self._model(random_input)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/keras/utils/traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/acceleras/utils/distributed_utils.py", line 122, in wrapper
    res = func(self, *args, **kwargs)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1153, in build
    self.compute_output_shape(input_shape)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1091, in compute_output_shape
    return self.compute_and_verify_output_shape(input_shape, verify_layer_inputs_shape=False)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/acceleras/model/hailo_model/hailo_model.py", line 1125, in compute_and_verify_output_shape
    layer_output_shape = layer.compute_output_shape(layer_input_shapes)
  File "/home/ykl/hailodfc/lib/python3.10/site-packages/hailo_model_optimization/acceleras/hailo_layers/base_hailo_layer.py", line 1572, in compute_output_shape
    raise ValueError(
ValueError: Inputs and input nodes not the same length in layer Aod-Net/concat1 - inputs: 4, nodes: 2
So, how can I solve this problem?

Hey @olive_michael ,

You’re hitting this error during model optimization:

ValueError: Inputs and input nodes not the same length in layer Aod-Net/concat1 - inputs: 4, nodes: 2

This happens when the DFC optimization tool tries to remove unused (“dead”) layers. In your case, it’s getting confused about the concat1 layer — it’s expecting 2 inputs, but for some reason it’s receiving 4.

Why This Happens

When layers are pruned during optimization (like conv2, concat2, or conv4), the tool tries to reconnect what’s left. Sometimes it doesn’t do that cleanly, and you end up with layers wired incorrectly — like concat1 thinking it has more inputs than it should.

How to Fix It

1. Double-check the inputs to concat1

Open your model in something like Netron or use model.summary() (if you’re using Keras) and confirm that concat1 is only receiving two inputs. In code, that should look something like:

concat1 = Concatenate()([input1, input2])

If there are more than two inputs listed there, or the wrong ones, that’s your problem.
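If you would rather check this programmatically than in Netron, here is a quick sketch, assuming your original (pre-parsing) model is a Keras model and you still have it on disk; the file name below is just a placeholder:

from tensorflow import keras

# Placeholder path to the original Keras model (not the HAR)
model = keras.models.load_model("aod_net.h5")

# Print how many tensors actually feed each Concatenate layer
for layer in model.layers:
    if isinstance(layer, keras.layers.Concatenate):
        inputs = layer.input if isinstance(layer.input, list) else [layer.input]
        print(f"{layer.name}: {len(inputs)} input(s)")

Any Concatenate layer reporting more inputs than you expect is the one to look at.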

2. Turn Off Dead Layer Removal

Since the issue is happening during that specific optimization step, a quick fix is to disable it. In your script (probably paser4.py), before runner.optimize(), add:

runner.set_optimization_param('enable_dead_layers_removal', False)

3. Manually Set Optimization Level

You’re seeing a warning about the optimization level being reduced because there’s no GPU:

[warning] Reducing optimization level to 0 ...

Go ahead and set it explicitly to avoid any surprises:

runner.set_optimization_param('optimization_level', 0)

I appreciate Omria's help with my previous question. However, I've now encountered a new issue.

(hailodfc) (base) ykl@ykl-virtual-machine:~/hailodfc$ /home/ykl/hailodfc/bin/python /home/ykl/hailodfc/paser4.py
[info] ParsedPerformanceParam command, setting optimization_level(max=2)
[info] Loading model script commands to Aod-Net from string
[info] ParsedPerformanceParam command, setting optimization_level(max=2)
Calibration data shape: (1449, 450, 600, 3), range: 0.0 - 255.0
Traceback (most recent call last):
  File "/home/ykl/hailodfc/paser4.py", line 44, in <module>
    runner.set_optimization_param('enable_dead_layers_removal', False)
AttributeError: 'ClientRunner' object has no attribute 'set_optimization_param'

My code is below

import os
import numpy as np
from hailo_sdk_client import ClientRunner

# Define the model name and HAR file path
model_name = "Aod-Net"
hailo_model_har_name = f"{model_name}_hailo_model.har"

# Make sure the HAR file exists
assert os.path.isfile(hailo_model_har_name), f"HAR file not found: {hailo_model_har_name}"

# Initialize the ClientRunner
runner = ClientRunner(har=hailo_model_har_name)

# Define model script commands
model_script = "\n".join([
    "model_optimization_flavor(optimization_level=1, compression_level=1, batch_size=1)",
    "performance_param(compiler_optimization_level=max)"
])

# Load the model script
runner.load_model_script(model_script)

# Load calibration dataset (numpy array)
calib_dataset_path = "/home/ykl/hailodfc/output/processed_calibration_data.npy"
assert os.path.isfile(calib_dataset_path), f"Calibration data not found: {calib_dataset_path}"

calib_data_np = np.load(calib_dataset_path)

# Check data shape and value range
print(f"Calibration data shape: {calib_data_np.shape}, range: {calib_data_np.min()} - {calib_data_np.max()}")

# Set optimization parameters
runner.set_optimization_param('enable_dead_layers_removal', False)
runner.set_optimization_param('optimization_level', 0)

# Run optimization with the calibration dataset
runner.optimize(calib_data_np)

# Save the optimized and quantized HAR file
quantized_model_har_path = f"{model_name}_quantized_model.har"
runner.save_har(quantized_model_har_path)

print(f"Quantized HAR file saved to: {quantized_model_har_path}")

@omria How can I solve this problem: 'ClientRunner' object has no attribute 'set_optimization_param'? Could you please provide guidance on how to fix it?

@omria Thank you. I have solved the problem. Just add the command below to the model script (.alls):

alls = [
    "pre_quantization_optimization(dead_layers_removal, policy=disabled)"
]
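For example, one way to wire that command into the script above is to append it to the model script that is already loaded with runner.load_model_script() in my code; this is a sketch of what worked for me, not the only possible setup:

model_script = "\n".join([
    "model_optimization_flavor(optimization_level=1, compression_level=1, batch_size=1)",
    "performance_param(compiler_optimization_level=max)",
    "pre_quantization_optimization(dead_layers_removal, policy=disabled)",
])
runner.load_model_script(model_script)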