Hey @jeff.singleton,
Our current generation of chips, the Hailo8, is designed for computer-vision-based AI. Our next generation, the Hailo10H, will support running LLMs. Alongside it, we will provide an inference server (something similar to Ollama) to help run these models on the Hailo10H.