Raspberry Pi 5 / Hailo-10 with 5.3 deb packages

Hi,

I removed most of the previously installed 5.1 libs that came from the Raspberry Pi package repository and installed version 5.3 from the Hailo website.

It now looks like this:

hailo-gen-ai-model-zoo/now 5.3.0 arm64 [installed,local]
hailo-models/stable,stable,now 1.0.0-2 all [installed,auto-removable]
hailo-tappas-core/stable,now 5.1.0 arm64 [installed,auto-removable]
hailort-pcie-driver/now 5.3.0 all [installed,local]
hailort/now 5.3.0 arm64 [installed,local]
python3-hailo-tappas/stable,now 0:5.1.0 arm64 [installed,auto-removable]
rpicam-apps-hailo-postprocess/stable,now 1.11.1-1 arm64 [installed,auto-removable]

I tried the following and it works:

curl --silent http://localhost:8000/api/chat -H 'Content-Type: application/json' -d '{"model":"qwen3:1.7b","messages":[{"role":"user","content":"Translate to French: The cat is on the table."}]}'

I |2026-04-12 20:47:56 1776019676683461| handle_completion:Got model 'qwen3:1.7b', path '/home/pi/.local/share/hailo-ollama/models/blob/sha256_cc9b9d1c92e35249b5a9b7bc31fbd652f03bba1232e99b9a8271845ad6f17821'
I |2026-04-12 20:47:56 1776019676683517| GenerationThread:loading model 'qwen3:1.7b'
I |2026-04-12 20:48:12 1776019692157354| GenerationThread:Finished loading model 'qwen3:1.7b'
I |2026-04-12 20:48:12 1776019692158149| GenerationThread:got prompt
I |2026-04-12 20:48:12 1776019692158175| GenerationThread:Continuation detected, sending 1 new messages
{"model":"qwen3:1.7b","created_at":"2026-04-12T18:48:12.782716576Z","message":{"role":"assistant","content":"Le"},"done":false}
{"model":"qwen3:1.7b","created_at":"2026-04-12T18:48:12.991350329Z","message":{"role":"assistant","content":" chat"},"done":false}
{"model":"qwen3:1.7b","created_at":"2026-04-12T18:48:13.197598948Z","message":{"role":"assistant","content":" est"},"done":false}
{"model":"qwen3:1.7b","created_at":"2026-04-12T18:48:13.404074457Z","message":{"role":"assistant","content":" sur"},"done":false}
{"model":"qwen3:1.7b","created_at":"2026-04-12T18:48:13.610919244Z","message":{"role":"assistant","content":" la"},"done":false}
{"model":"qwen3:1.7b","created_at":"2026-04-12T18:48:13.817152512Z","message":{"role":"assistant","content":" table"},"done":false}
{"model":"qwen3:1.7b","created_at":"2026-04-12T18:48:14.023627280Z","message":{"role":"assistant","content":"."},"done":false}
{"model":"qwen3:1.7b","created_at":"2026-04-12T18:48:14.230441030Z","message":{"role":"assistant","content":""},"done":true,"done_reason":"stop","total_duration":2071152262,"eval_count":7}
I |2026-04-12 20:48:14 1776019694232436| generation_context:got notified

But when I try with the webui, I get this:

I |2026-04-12 20:50:53 1776019853843411| handle_completion:Got model 'qwen3:1.7b', path '/home/pi/.local/share/hailo-ollama/models/blob/sha256_cc9b9d1c92e35249b5a9b7bc31fbd652f03bba1232e99b9a8271845ad6f17821'
I |2026-04-12 20:50:53 1776019853843464| GenerationThread:loading model 'qwen3:1.7b'
I |2026-04-12 20:51:21 1776019881688227| GenerationThread:Finished loading model 'qwen3:1.7b'
I |2026-04-12 20:51:21 1776019881688681| GenerationThread:got prompt
I |2026-04-12 20:51:21 1776019881698386| GenerationThread:New conversation, clearing context and sending 2 messages
[HailoRT] [error] Failed to render prompt from JSON strings: [json.exception.parse_error.101] parse error at line 2, column 0: syntax error while parsing value - invalid string: control character U+000A (LF) must be escaped to \u000A or \n; last read: '"User Context:<U+000A>'
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INTERNAL_FAILURE(8)
[HailoRT] [error] CHECK_SUCCESS failed with status=HAILO_INTERNAL_FAILURE(8)
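For reference, the parse error above can be reproduced locally with any strict JSON parser: per the JSON spec (RFC 8259), a raw LF inside a string value is invalid and must be escaped as `\n`. This is a minimal Python sketch; the "User Context:" payload is illustrative, based on the snippet the error message shows.

```python
import json

# The working curl request carries the newline escaped as \n inside the
# JSON string, so the payload is valid JSON:
valid = '{"content": "User Context:\\nhello"}'
assert json.loads(valid)["content"] == "User Context:\nhello"

# The failing request apparently contains a raw (unescaped) LF inside the
# string value. RFC 8259 forbids unescaped control characters, so a strict
# parser rejects it -- the same class of error HailoRT reports:
invalid = '{"content": "User Context:\nhello"}'
try:
    json.loads(invalid)
    print("parsed (unexpected)")
except json.JSONDecodeError as exc:
    print("rejected:", exc.msg)
```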

Is there anything I need to change?

Thanks


Hi @Badz ,
What "webui" have you tried?
Thanks,

Ji

Open WebUI 0.8.12

It worked with version 5.1 of the Hailo tools.

Thanks

Hello, I have the same problem. I think it's caused by the RAG prompt from the Open WebUI interface. Even if I modify the RAG prompt, the character '<U+000A>' (LF) is appended to the end of the prompt, and Hailo Ollama does not accept it.
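As a client-side sketch of what a fix would look like: building the request body with a JSON serializer (rather than string concatenation) escapes the LF automatically, producing a payload the parser accepts. The prompt text below is illustrative; the model name is the one used earlier in the thread.

```python
import json

# A RAG-style prompt that contains a raw newline, like the one Open WebUI sends:
rag_prompt = "User Context:\nSome retrieved document text."

# json.dumps escapes the LF to \n in the serialized string, so the wire
# format contains no raw control characters:
payload = json.dumps({
    "model": "qwen3:1.7b",
    "messages": [{"role": "user", "content": rag_prompt}],
})
assert "\\n" in payload      # newline is escaped in the serialized form
assert "\n" not in payload   # no raw LF remains on the wire
assert json.loads(payload)["messages"][0]["content"] == rag_prompt
```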

Hi @diconti, @Badz,
We are looking into it.
I’ll keep updating here.
Thanks,

Hi @diconti, @Badz,
We can confirm we found the issue and we are working to fix it.
Thanks,
