Helicone Community Page

No Anthropic api_base for helicone-llama-index integration

OpenAI calls via llama-index log to Helicone, but this does not seem to be possible for Anthropic calls. The docs don't mention a specific API base for Anthropic, and I've tried f"https://anthropic.helicone.ai/{HELICONE_API_KEY}/v1", which wasn't working. Am I correct that there is no Helicone-llama-index integration for Anthropic models?
cc: @Cole
No, there is an Anthropic base URL for Helicone.
Hi, yes we have https://anthropic.helicone.ai.

That should work.
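
For context, here is a minimal sketch of pointing the plain Anthropic Python SDK at that base URL. This is an assumption-laden example, not from the thread: it assumes ANTHROPIC_API_KEY and HELICONE_API_KEY are set in the environment, and the llama-index variant is shown further down.

import os
import anthropic

# Sketch only: route Anthropic SDK traffic through the Helicone gateway
# and authenticate to Helicone via the Helicone-Auth header.
client = anthropic.Anthropic(
    api_key=os.environ["ANTHROPIC_API_KEY"],
    base_url="https://anthropic.helicone.ai",
    default_headers={
        "Helicone-Auth": f"Bearer {os.environ['HELICONE_API_KEY']}",
    },
)

message = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello"}],
)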
@Cole @Haider Hirkani thank you for your help! The requests are being logged on Helicone, but the custom properties are not being added, unlike for OpenAI with llama-index.
This is how I am instantiating the anthropic model:
llm = AnthropicMultiModal(
    model="claude-3-5-sonnet-20240620",
    max_tokens=4096,
    temperature=0.0,
    api_key=api_key,
    api_base=f"https://anthropic.helicone.ai/{HELICONE_API_KEY}",
    default_headers={
        "Helicone-Property-Client": "test",
        "Helicone-Property-Project": "test",
        "Helicone-Property-Process-Name": "test",
        "Helicone-Property-Step-Name": "test",
    },
)

I am following the same syntax for OpenAI calls via llama-index


I also tried providing the auth headers in default_headers, but this gave the same result:

{ "Authorization": f"Bearer {api_key}", "Helicone-Auth": f"Bearer {HELICONE_API_KEY}", "Helicone-Property-Client": "test", "Helicone-Property-Project": "test", "Helicone-Property-Process-Name": "test", "Helicone-Property-Step-Name": "test", }
Attachment: Screenshot_2024-10-08_at_13.01.58.png
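
For comparison, the "same syntax for OpenAI" mentioned above looks roughly like this with llama-index. This is a sketch, assuming the llama_index.llms.openai.OpenAI class forwards api_base and default_headers to the underlying client and using the oai.helicone.ai gateway; the model name here is just an example, not taken from the thread.

from llama_index.llms.openai import OpenAI

# Sketch: OpenAI via llama-index routed through Helicone, with custom properties.
llm = OpenAI(
    model="gpt-4o-mini",  # example model, not from the thread
    api_key=api_key,
    api_base="https://oai.helicone.ai/v1",
    default_headers={
        "Helicone-Auth": f"Bearer {HELICONE_API_KEY}",
        "Helicone-Property-Client": "test",
        "Helicone-Property-Project": "test",
    },
)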
helicone_headers = {
    "Helicone-Auth": f"Bearer {settings.HELICONE_API_KEY}",
    "Helicone-Property-ProjectName": settings.SERVICE_NAME,
    "Helicone-Property-Environment": settings.ENVIRONMENT,
    "Helicone-Property-UserId": auth_data.user_id,
}

from llama_index.llms.anthropic import Anthropic

# Point the llama-index Anthropic client at the Helicone gateway via base_url
# and attach the Helicone auth and custom-property headers.
Anthropic(
    model=llm_config.model,
    api_key=ANTHROPIC_API_KEY,
    default_headers=helicone_headers,
    base_url="https://anthropic.hconeai.com",
    temperature=temperature,
    max_tokens=max_tokens,
)
It should be logged this way.
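
As a quick sanity check, a single request through that instance should show up in Helicone tagged with those properties. A short usage sketch, reusing the objects from the snippet above and the standard llama-index .complete() call:

llm = Anthropic(
    model=llm_config.model,
    api_key=ANTHROPIC_API_KEY,
    default_headers=helicone_headers,
    base_url="https://anthropic.hconeai.com",
)
# One request; it should appear in Helicone with the property headers above.
print(llm.complete("ping").text)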
Are you using claude-3-5-sonnet-20240620? I am following the same syntax as you, but I am using AnthropicMultiModal instead of Anthropic.

AnthropicMultiModal(
    model="claude-3-5-sonnet-20240620",
    max_tokens=4096,
    temperature=0.0,
    api_key=api_key,
    api_base=f"https://anthropic.helicone.ai/{HELICONE_API_KEY}",
    base_url="https://anthropic.hconeai.com/",
    default_headers={
        "Helicone-Auth": f"Bearer {HELICONE_API_KEY}",
        "Helicone-Property-Client": "test client",
        "Helicone-Property-Project": "test project",
        "Helicone-Property-Process-Name": "test process",
        "Helicone-Property-Step-Name": "test step",
    },
)
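
One thing that stands out is that this call mixes the key-in-URL api_base style with a separate base_url plus a Helicone-Auth header. A sketch of the variant that mirrors the working Anthropic example above, i.e. base_url and the Helicone-Auth header only, with no key in the URL. This is untested and assumes AnthropicMultiModal forwards base_url and default_headers the same way Anthropic does.

from llama_index.multi_modal_llms.anthropic import AnthropicMultiModal

# Untested sketch: same Helicone wiring as the working Anthropic example above.
llm = AnthropicMultiModal(
    model="claude-3-5-sonnet-20240620",
    max_tokens=4096,
    temperature=0.0,
    api_key=api_key,
    base_url="https://anthropic.helicone.ai",
    default_headers={
        "Helicone-Auth": f"Bearer {HELICONE_API_KEY}",
        "Helicone-Property-Client": "test client",
        "Helicone-Property-Project": "test project",
    },
)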
Hi, apologies, I will try to look into this ASAP.
No worries thanks!
@Haider Hirkani @Cole is there an api_base for OpenAIEmbedding from llama-index?
Not that I know of
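
For what it's worth, OpenAIEmbedding in llama-index appears to accept an api_base argument (worth verifying against your installed version), so the key-in-URL pattern used earlier in this thread should in principle apply to embeddings too. An untested sketch, with an example embedding model name that is not from the thread:

from llama_index.embeddings.openai import OpenAIEmbedding

# Untested sketch: route embedding calls through Helicone using the key-in-URL pattern.
embed_model = OpenAIEmbedding(
    model="text-embedding-3-small",  # example model, not from the thread
    api_key=api_key,
    api_base=f"https://oai.helicone.ai/{HELICONE_API_KEY}/v1",
)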