Helicone Community Page

Updated 4 weeks ago

Helicone and temperature support in OpenAI calls

At a glance

The community member is having an issue with the temperature parameter when using OpenAI calls through Helicone. They are also using LangChain, and are unsure whether the issue lies with Helicone or with LangChain. The comments clarify that Helicone does not inject a default temperature, so the issue is likely LangChain, which defaults the temperature to 0.7 when it is not provided. The resolution: the o1 models do not support custom temperature values, but the parameter must still be included and set to 1.0.

Does anyone know whether Helicone injects a default temperature when passing through OpenAI calls? The o1 models don't support temperature yet, so I'm leaving that out. Receiving an error that temperature isn't supported yet. Also using langchain but not sure whether it's langchain injecting temp or helicone...
Marked as solution
All good now. As it turns out, the o1 models don't accept custom temperature values, but the parameter must still be included and set to 1.0. LangChain defaults it to 0.7 if not provided.
4 comments
We do not inject temperature unless you are using our prompts feature
It is likely LangChain
Thanks! Will dig into that mess next.
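A minimal sketch of the fix in plain Python, without LangChain. The `build_payload` helper is hypothetical (not part of Helicone, LangChain, or the OpenAI SDK); it just shows the idea: force `temperature` to 1.0 whenever an o1 model is used, instead of letting a client library silently default it to 0.7.

```python
# Hypothetical helper: builds an OpenAI chat-completions payload.
# Assumption from this thread: o1 models reject any temperature other
# than 1.0, while LangChain defaults temperature to 0.7 when unset.

def build_payload(model: str, messages: list, temperature: float = 0.7) -> dict:
    payload = {"model": model, "messages": messages}
    if model.startswith("o1"):
        # o1 models only accept temperature=1.0; anything else errors out.
        payload["temperature"] = 1.0
    else:
        payload["temperature"] = temperature
    return payload

# The LangChain-style 0.7 default is overridden for o1 models:
print(build_payload("o1-mini", [{"role": "user", "content": "hi"}])["temperature"])
```

The key design choice is pinning the value per model family at payload-construction time, so no downstream default (from LangChain or any other client) can reintroduce an unsupported temperature.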