Error when optimizing HAR from ONNX: google.protobuf.message.DecodeError: Error parsing message with type 'tensorflow.FunctionDef'

Hi, I got the following error while optimizing a HAR generated from a simplified ONNX model.

```
[info] Using dataset with 64 entries for calibration
Calibration: 100%|██████████████████████████████████| 64/64 [02:13<00:00,  2.09s/entries]
[info] LayerNorm Decomposition is done (completion time is 00:02:36.84)
[info] Starting Statistics Collector
[info] Using dataset with 64 entries for calibration
Calibration: 100%|██████████████████████████████████| 64/64 [05:02<00:00,  4.72s/entries]
[info] Statistics Collector is done (completion time is 00:05:18.60)
[info] Output layer detr_res101/conv200 with sigmoid activation was detected. Forcing its output range to be [0, 1] (original range was [0.001021527568809688, 1.0]).
[info] Starting Fix zp_comp Encoding
[info] Fix zp_comp Encoding is done (completion time is 00:00:00.01)
[info] Starting Matmul Equalization
[info] Matmul Equalization is done (completion time is 00:00:09.56)
[info] activation fitting started for detr_res101/reduce_sum_softmax1/act_op
[info] No shifts available for layer detr_res101/conv109/conv_op, using max shift instead. delta=0.1413
[info] No shifts available for layer detr_res101/conv109/conv_op, using max shift instead. delta=0.0706
[info] activation fitting started for detr_res101/reduce_sum_softmax2/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax3/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax4/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax5/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax6/act_op
[info] No shifts available for layer detr_res101/conv141/conv_op, using max shift instead. delta=0.1080
[info] No shifts available for layer detr_res101/conv141/conv_op, using max shift instead. delta=0.0540
[info] activation fitting started for detr_res101/reduce_sum_softmax7/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax8/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax9/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax10/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax11/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax12/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax13/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax14/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax15/act_op
[info] activation fitting started for detr_res101/reduce_sum_softmax16/act_op
[info] No shifts available for layer detr_res101/conv192/conv_op, using max shift instead. delta=0.5160
[info] No shifts available for layer detr_res101/conv192/conv_op, using max shift instead. delta=0.2580
[info] activation fitting started for detr_res101/reduce_sum_softmax17/act_op
[info] No shifts available for layer detr_res101/conv196/conv_op, using max shift instead. delta=0.0865
[info] No shifts available for layer detr_res101/conv196/conv_op, using max shift instead. delta=0.0433
[info] Finetune encoding skipped
[info] Bias Correction skipped
[info] Adaround skipped
[info] Starting Quantization-Aware Fine-Tuning
[info] Using dataset with 1024 entries for finetune
Epoch 1/4
Traceback (most recent call last):
  File "/jarvis/workspace/chaewon/embedded/optimize_har.py", line 23, in <module>
    runner.optimize(calibration_dataset)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 2093, in optimize
    self._optimize(calib_data, data_type=data_type, work_dir=work_dir)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_common/states/states.py", line 16, in wrapped_func
    return func(self, *args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/runner/client_runner.py", line 1935, in _optimize
    self._sdk_backend.full_quantization(calib_data, data_type=data_type, work_dir=work_dir)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1045, in full_quantization
    self._full_acceleras_run(self.calibration_data, data_type)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_sdk_client/sdk_backend/sdk_backend.py", line 1229, in _full_acceleras_run
    optimization_flow.run()
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/tools/orchestator.py", line 306, in wrapper
    return func(self, *args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 326, in run
    step_func()
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
    result = method(*args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/tools/subprocess_wrapper.py", line 111, in parent_wrapper
    raise SubprocessTracebackFailure(*child_messages)
hailo_model_optimization.acceleras.utils.acceleras_exceptions.SubprocessTracebackFailure: Subprocess failed with traceback

Traceback (most recent call last):
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/tools/subprocess_wrapper.py", line 73, in child_wrapper
    func(self, *args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 351, in step2
    self.post_quantization_optimization()
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
    result = method(*args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 410, in post_quantization_optimization
    self._finetune()
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/tools/orchestator.py", line 250, in wrapped
    result = method(*args, **kwargs)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/flows/optimization_flow.py", line 690, in _finetune
    finetune.run()
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/algorithms/optimization_algorithm.py", line 50, in run
    return super().run()
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/algorithms/algorithm_base.py", line 150, in run
    self._run_int()
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/algorithms/finetune/qft.py", line 387, in _run_int
    self.run_qft(self._model_native, self._model, metrics=self.metrics)
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/hailo_model_optimization/algorithms/finetune/qft.py", line 507, in run_qft
    self.main_train_summary_per_epoch = qft_distiller.fit(
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/keras/utils/traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "/local/workspace/hailo_virtualenv/lib/python3.10/site-packages/tensorflow/python/eager/polymorphic_function/monomorphic_function.py", line 298, in _get_definition
    function_def.ParseFromString(compat.as_bytes(proto_data))
google.protobuf.message.DecodeError: Error parsing message with type 'tensorflow.FunctionDef'
```

I built the ONNX model with a DETR ResNet-101 backbone using opset 17. After simplifying the ONNX, I tried to optimize the resulting HAR with optimization level 2 and got this error.
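
For reference, `optimize_har.py` (the script in the traceback) is essentially the following minimal sketch; the file names and the calibration-set loading are placeholders, not my exact code:

```python
# Minimal sketch of optimize_har.py; file names and calibration
# loading are placeholders for my actual setup.
import numpy as np
from hailo_sdk_client import ClientRunner

# HAR produced by translating the simplified ONNX.
runner = ClientRunner(har="detr_res101.har")

# Optimization level 2 enables Quantization-Aware Fine-Tuning,
# which is the step that crashes ("Epoch 1/4" in the log above).
runner.load_model_script("model_optimization_flavor(optimization_level=2)\n")

# Calibration data shaped (N, H, W, C) to match the network input.
calibration_dataset = np.load("calib_set.npy")

runner.optimize(calibration_dataset)  # line 23 in the traceback
runner.save_har("detr_res101_optimized.har")
```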

How can I fix it?
I'm using Dataflow Compiler 3.29.0, in the Docker image distributed by Hailo.
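
In case it matters, the simplification step was roughly the standard onnx-simplifier flow (model file names below are placeholders):

```python
# Sketch of the ONNX simplification step using onnx-simplifier;
# model file names are placeholders.
import onnx
from onnxsim import simplify

model = onnx.load("detr_res101.onnx")  # exported with opset 17
model_simp, check_ok = simplify(model)
assert check_ok, "onnxsim could not validate the simplified model"
onnx.save(model_simp, "detr_res101_sim.onnx")
```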