Helicone Community Page

Recommended Setup for Ollama Integration with Helicone

At a glance

The community member is new to Helicone and runs many models with Ollama. They are wondering about the recommended setup, as the SDK documentation does not seem to properly support Ollama. The community member is unsure if they should use the OpenAI API interface instead of the Ollama-specific setup, and would like guidance on the intended setup for Ollama that will properly capture statistics.

In the comments, the community member pings the post again, unsure if they are asking in the wrong place or if Helicone support for Ollama is limited. They would love some guidance to know whether it is worth digging deeper into the product.

Hey all, new to Helicone coming from Arize. I run a lot of models with Ollama and I was wondering what the recommended setup is, because I followed the docs to use the SDK and it seems like a lot of things aren't properly wired up there for Ollama. Is it recommended to use the OpenAI API interface and pretend that Ollama isn't Ollama, or what is the intended setup for Ollama that will properly capture statistics?
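
For anyone landing here with the same question, the "use the OpenAI API interface" workaround the post describes would look roughly like the sketch below: point the standard openai SDK at Helicone's generic gateway and have it forward requests to an Ollama server exposing its OpenAI-compatible endpoint. The gateway base URL, the Helicone-Target-Url header name, and the host name are assumptions to verify against Helicone's current docs; only the openai SDK usage and Ollama's OpenAI-compatible API are standard.

# Sketch only: the gateway URL and Helicone-* header names below are assumptions
# taken from Helicone's generic gateway docs; verify them before relying on this.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://gateway.helicone.ai",  # assumed Helicone gateway base URL
    api_key="ollama",  # Ollama ignores the key, but the SDK requires a value
    default_headers={
        "Helicone-Auth": f"Bearer {os.environ['HELICONE_API_KEY']}",
        # Assumed header naming the upstream target. The Ollama server must be
        # reachable from Helicone; a plain localhost:11434 won't work with the
        # hosted gateway. "my-ollama-host" is a placeholder.
        "Helicone-Target-Url": "http://my-ollama-host:11434",
    },
)

response = client.chat.completions.create(
    model="llama3",  # any model already pulled into the Ollama server
    messages=[{"role": "user", "content": "Hello from Helicone + Ollama"}],
)
print(response.choices[0].message.content)

If the Ollama box isn't reachable from Helicone's hosted gateway, a self-hosted Helicone deployment or Helicone's async/manual logging route may be the better fit; check the docs for those options.
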
1 comment
Hey all, pinging on this again here, not sure if I'm asking in the wrong place or if Helicone support for Ollama is limited and no one wants to talk about it. 😁 Would love some guidance just so I know whether it's worth digging deeper into the product or not.