Is there an Ollama alternative that supports Hailo?

Yep. As the topic says, I’m looking for an Ollama alternative that might be willing to or already does support Hailo chips.

After reading and commenting on the Ollama GitHub, it is clear they are not interested in supporting Hailo natively.

Rather than search the web, I am opening this topic to try to generate some discussion first. Maybe the word will get out to developers and spawn new projects.

Note: I have a very specific use case I am building out. I need that Ollama-like API capability with native Hailo support to do it.
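For concreteness, by "Ollama-like API" I mean a local HTTP inference endpoint along the lines of Ollama's `/api/generate`. A minimal sketch of the request shape, assuming Ollama's standard endpoint; the host, port, and model name below are placeholders:

```python
import json

# Ollama-style endpoint: POST /api/generate on a locally running server.
# Host/port and model name are placeholders for illustration.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str, stream: bool = False) -> str:
    """Return the JSON body for an Ollama-style /api/generate call."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream})

body = build_generate_request("llama3", "Why is the sky blue?")

# Sending it requires a running server, e.g.:
#   import urllib.request
#   req = urllib.request.Request(
#       OLLAMA_URL, data=body.encode(),
#       headers={"Content-Type": "application/json"})
#   print(urllib.request.urlopen(req).read().decode())
```

Any Hailo-native alternative would ideally expose a compatible endpoint so existing tooling built against this request shape keeps working.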

Regards,
JPOP

Hey @jeff.singleton,

Our current-generation chips, the Hailo-8, are designed for computer-vision AI. Our next generation of chips, the Hailo-10H, will be able to run LLMs. We will provide with it an inference server, something similar to Ollama, to help run these models on the Hailo-10H.

Thank you! Will the Hailo-10H work with the Raspberry Pi AI HAT?