Helicone Community Page


OpenAI - Helicone

hi @abe, that package is good to go. Docs for it are here: https://docs.helicone.ai/getting-started/integration-method/openai

although it's worth noting, caching won't be available for it. That feature is only on our proxy implementation. Here is a comparison: https://docs.helicone.ai/getting-started/proxy-vs-async
ok cool -- I'll consider using it. For now I decided to just set the baseURL on the default openai package and added the headers myself
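
For reference, a minimal sketch of that manual setup, assuming the OpenAI Node SDK v4. The proxy base URL and the Helicone-Auth header follow Helicone's integration docs; the environment variable names are illustrative.

import OpenAI from "openai";

// Stock openai client pointed at the Helicone proxy, with the
// Helicone-Auth header attached manually instead of using the helicone package.
const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: "https://oai.helicone.ai/v1", // Helicone's OpenAI proxy endpoint
  defaultHeaders: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
  },
});
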
would caching work if I added the headers myself? or is there something fundamentally missing that needs to be added on your side to enable caching?
we perform caching within our Cloudflare Worker
so it's only available on the proxy, with the cache header
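
Concretely, with the proxy setup sketched above, caching is opted into per request by adding Helicone's cache header, for example via the v4 SDK's per-request options. The header name comes from Helicone's caching docs; the model and message are illustrative.

// Repeat calls with the same request body can then be served from the worker's cache.
const completion = await openai.chat.completions.create(
  { model: "gpt-3.5-turbo", messages: [{ role: "user", content: "Hello" }] },
  { headers: { "Helicone-Cache-Enabled": "true" } },
);
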
got it, and does the js library use the proxy?
no, it does not use the proxy; it writes directly to our servers
and logs asynchronously
gotcha, sounds good
thanks for the info!
Hi @abe, sorry for the confusion. Our package does support the proxy if you want to use it.

https://www.npmjs.com/package/helicone

(The package you shared is deprecated; we need to remove it.)

It does not yet support Fetch or the OpenAI v4 package. We are adding that next sprint 🙂

Please let me know if you have any questions.
(Attachment: image.png)