Helicone Community Page

Updated 7 months ago

Switching from OpenAI GPT to Self-Hosted LLaMA3s

Hey guys! Long-time Helicone user here. We are currently switching from OpenAI GPT to self-hosted Llama 3 models. What is the best way to keep Helicone running for us while we switch? We are still using the OpenAI endpoints, so the baseUrl and apiKey have to point to our self-hosted Llama 3. How can I work around this?
3 comments
hey @wayne! Are you hosting Llama 3 locally or somewhere else? Also, please check out our gateway integration, which is made for exactly this kind of thing:

https://docs.helicone.ai/getting-started/integration-method/gateway#unapproved-domains
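Here is a rough sketch of what that can look like with the OpenAI Node SDK, assuming your self-hosted Llama 3 server exposes an OpenAI-compatible API and the gateway accepts the headers described in the docs above. The target URL, environment variables, and model name are placeholders for your own setup:

```typescript
// Minimal sketch: route an OpenAI-compatible client to a self-hosted Llama 3
// server through the Helicone gateway. All URLs, keys, and the model name
// below are placeholders, not real values.
import OpenAI from "openai";

const client = new OpenAI({
  // Send requests to the Helicone gateway instead of api.openai.com
  baseURL: "https://gateway.helicone.ai",
  // API key expected by your self-hosted Llama 3 server (if it requires one)
  apiKey: process.env.LLAMA_API_KEY,
  defaultHeaders: {
    // Authenticate with Helicone so requests are logged to your account
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
    // Tell the gateway where your self-hosted Llama 3 endpoint lives
    "Helicone-Target-Url": "https://llama.example.internal",
  },
});

// Usage is unchanged from the regular OpenAI SDK; only the routing differs
const completion = await client.chat.completions.create({
  model: "llama-3-8b-instruct", // whatever model name your server exposes
  messages: [{ role: "user", content: "Hello from Helicone + Llama 3!" }],
});

console.log(completion.choices[0].message.content);
```

The idea is that your application code keeps using the familiar OpenAI client; only the baseURL and headers change, and Helicone sits in the middle to log requests to the new backend.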
@wayne we are happy to help out here synchronously on a call. If you have 15 minutes, I'd love to show you how to get going.