Trying to use hailo-ollama on the Raspberry Pi with VS Code Copilot

When I try to use my Ollama models with Copilot on my Raspberry Pi with the AI HAT+ 2, VS Code reports that I have version 0.5.1 of hailo-ollama and that it needs version 0.6.2. I have tried to update my hailo-ollama, but it seems to be up to date. Does anyone know if version 0.6.2 is in the works? In the meantime I am trying to use Continue.

Hi @user813,

I can confirm that hailo-ollama works with the Continue VS Code extension on both x86 and RPi 5.

Thanks,

Do you mind sharing the config you used with continue.dev?

Hi @user962,

Please note that this is the full list of roles and context providers, included just for reference; hailo-ollama does not support all of them.

name: Local Agent
version: 1.0.0
schema: v1
models:
  - name: Hailo 10H Qwen Code     # Custom display name
    provider: ollama              # Must be "ollama" 
    model: qwen2.5-coder:1.5b     # Must match exact model name
    apiBase: http://localhost:8000
    roles:
      - chat
      - autocomplete
      - edit
      - apply
      - embed
      - rerank
context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase
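Since the `model` field must match the model name exactly as the server reports it, it can help to query the server before editing the config. A quick sketch, assuming hailo-ollama exposes the standard Ollama REST API at the `apiBase` above (`http://localhost:8000`):

```shell
# List the models the local hailo-ollama server knows about.
# GET /api/tags is the standard Ollama endpoint for installed models.
curl -s http://localhost:8000/api/tags \
  | python3 -c 'import sys, json; [print(m["name"]) for m in json.load(sys.stdin)["models"]]'
```

Copy one of the printed names (e.g. `qwen2.5-coder:1.5b`) verbatim into the `model` field; a mismatch is a common reason the extension silently falls back or errors out.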