Helicone Community Page

Updated 4 weeks ago

OpenRouter and Helicone

I am trying this code to add Helicone to my model:
import os
import traceback

def initialize_llm(logger):
    """Initialize LLM with OpenRouter configuration."""
    logger.debug("Starting LLM initialization")
    try:
        # Get API keys
        openrouter_api_key = os.getenv("OPENROUTER_API_KEY")
        helicone_api_key = os.getenv("HELICONE_API_KEY")

        if not openrouter_api_key or not helicone_api_key:
            raise ValueError("Missing required API keys")

        # Initialize LLM with proper configuration
        llm = LLM(
            model="anthropic/claude-3-haiku",
            custom_llm_provider="openrouter",
            api_base="https://openrouter.ai/api/v1/chat/completions",  # Changed this
            default_headers={
                "Authorization": f"Bearer {openrouter_api_key}",
                "Helicone-Auth": f"Bearer {helicone_api_key}",
                "HTTP-Referer": "https://github.com/your-repo",
                "X-Title": "Blog Creation Bot",
                "Content-Type": "application/json"
            }
        )

        logger.info("Successfully initialized OpenRouter LLM with Claude and Helicone tracking")
        return llm

    except Exception as e:
        logger.error(f"Error in LLM initialization: {str(e)}\n{traceback.format_exc()}")
        raise

but I get a 404 error.

What am I doing wrong here?
Thanks!
5 comments
Hello, we are looking into this. Sorry about the response delay.
Hey @Gilad, can you try changing the api_base to https://openrouter.helicone.ai/api/v1/chat/completions instead? This will route the LLM call through Helicone to OpenRouter. Sorry about the late reply, and do let me know if you still face any issues!

For more reference: https://docs.helicone.ai/getting-started/integration-method/openrouter
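For anyone landing here later, the suggested change can be sketched without any particular LLM wrapper by building the request against the Helicone gateway URL directly. `build_helicone_openrouter_request` is a hypothetical helper name for illustration, and the model slug is just an example:

```python
import os

# Helicone's gateway endpoint for OpenRouter (from the docs linked above)
HELICONE_OPENROUTER_BASE = "https://openrouter.helicone.ai/api/v1/chat/completions"

def build_helicone_openrouter_request(openrouter_api_key, helicone_api_key,
                                      model="anthropic/claude-3-haiku"):
    """Build the URL, headers, and payload for an OpenRouter call routed through Helicone."""
    headers = {
        "Authorization": f"Bearer {openrouter_api_key}",  # OpenRouter key authenticates the model call
        "Helicone-Auth": f"Bearer {helicone_api_key}",    # Helicone key enables request logging
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "Hello"}],
    }
    return HELICONE_OPENROUTER_BASE, headers, payload

if __name__ == "__main__":
    url, headers, payload = build_helicone_openrouter_request(
        os.getenv("OPENROUTER_API_KEY", ""), os.getenv("HELICONE_API_KEY", "")
    )
    # Send with any HTTP client, e.g.:
    # requests.post(url, headers=headers, json=payload)
    print(url)
```

The key point is that the request must hit openrouter.helicone.ai rather than openrouter.ai, otherwise Helicone never sees the call and cannot log it.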
That didn't work either.
Hey, could you send the error message you're getting from this? Also, which package are you using for the LLM class?