A community member is having an issue with the temperature parameter when making OpenAI calls through Helicone. They are also using LangChain and are unsure whether the issue lies with Helicone or LangChain. The comments indicate that Helicone does not inject a default temperature, and that the issue is likely with LangChain, which defaults the temperature to 0.7 when none is provided. The resolution: the o1 models only support the default temperature, so it must be explicitly set to 1.0 to override LangChain's default.
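The interaction described above can be sketched in plain Python. This is an illustrative simulation of the failure mode, not Helicone's or LangChain's actual code; the function names and error message wording are assumptions, while the 0.7 default and the o1 restriction come from the thread:

```python
# Illustrative sketch: a client library (like LangChain) that fills in a
# default temperature when the caller omits it, and a model family (o1)
# that rejects any temperature other than 1.
LANGCHAIN_DEFAULT_TEMPERATURE = 0.7  # LangChain's default, per the thread


def build_request(model, temperature=None):
    """Mimic a client that injects a default temperature when none is given."""
    if temperature is None:
        temperature = LANGCHAIN_DEFAULT_TEMPERATURE
    return {"model": model, "temperature": temperature}


def validate_o1(request):
    """o1 models only accept the default temperature of 1."""
    if request["model"].startswith("o1") and request["temperature"] != 1:
        raise ValueError(
            f"'temperature' does not support {request['temperature']} "
            "with this model; only the default (1) is supported."
        )


# Omitting temperature lets the 0.7 default leak in and triggers the error;
# passing temperature=1 explicitly succeeds.
validate_o1(build_request("o1-mini", temperature=1))  # OK
```

The practical fix in LangChain is the same shape: pass `temperature=1` explicitly when constructing the chat model so the 0.7 default is never sent.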
Does anyone know whether Helicone injects a default temperature when passing through OpenAI calls? The o1 models don't support temperature yet, so I'm leaving it out, but I'm receiving an error that temperature isn't supported. I'm also using LangChain, so I'm not sure whether it's LangChain injecting the temperature or Helicone...