Helicone Community Page


Integrating Helicone Proxy with AzureChatOpenAI from Langchain

I can't figure out how to get Helicone proxy to work with AzureChatOpenAI from langchain.

This does not work; it returns 404, "Model not found" (I've tried every variation of the model name I can think of):
```javascript
const model = new AzureChatOpenAI({
  model: "gpt-4o-mini-2024-07-18",
  temperature: 0,
  maxRetries: 2,
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
  azureOpenAIBasePath: "https://oai.helicone.ai",
  configuration: {
    defaultHeaders: {
      "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
      "Helicone-OpenAI-Api-Base": process.env.AZURE_OPENAI_ENDPOINT,
    },
  },
});
```


However, if I send the request straight to Azure, it works fine.

i.e. this works:
```javascript
const model = new AzureChatOpenAI({
  model: "gpt-4o-mini-2024-07-18",
  temperature: 0,
  maxRetries: 2,
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
});
```


What am I doing wrong? I feel like I've tried every possible combination, and the docs seem out of date on this.
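One thing worth checking: LangChain's `azureOpenAIBasePath` is documented as the path up to and including the `/openai/deployments` segment (e.g. `https://westeurope.api.cognitive.microsoft.com/openai/deployments`), with the deployment name appended after it. A bare `https://oai.helicone.ai` would drop that segment from the proxied URL, which would explain a 404. A minimal sketch of the URL assembly, assuming LangChain appends `/<deployment>/chat/completions` plus the `api-version` query parameter (the deployment name and version below are placeholders):

```typescript
// Sketch of how the request URL is assembled from azureOpenAIBasePath,
// assuming LangChain appends "/<deployment>/chat/completions" and the
// api-version query parameter. Deployment/version values are placeholders.
function buildAzureUrl(basePath: string, deployment: string, apiVersion: string): string {
  return `${basePath}/${deployment}/chat/completions?api-version=${apiVersion}`;
}

// Bare proxy host: the "/openai/deployments" segment Azure expects never
// appears in the path, which would explain the upstream 404.
const broken = buildAzureUrl("https://oai.helicone.ai", "my-deployment", "2024-06-01");

// Base path carrying the full Azure-style prefix:
const fixed = buildAzureUrl("https://oai.helicone.ai/openai/deployments", "my-deployment", "2024-06-01");

console.log(broken);
console.log(fixed);
```

If that is the issue, setting `azureOpenAIBasePath` to `"https://oai.helicone.ai/openai/deployments"` (keeping the `Helicone-Auth` and `Helicone-OpenAI-Api-Base` headers as above) may be worth trying.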