Is this happening when you try to access the helicone.ai website?
Sorry for the confusion. I am using LangChain to call OpenAI, with OPENAI_API_BASE set to
https://oai.hconeai.com/v1. The error happens when I call the API.
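The setup described above can be sketched like this. This is a minimal illustration, not the user's actual code; the API key is a placeholder, and LangChain's `ChatOpenAI` (referenced only in a comment) reads these environment variables at construction time.

```python
import os

# Route LangChain's OpenAI calls through Helicone by overriding the
# API base URL, as described in the thread. Key value is a placeholder.
os.environ["OPENAI_API_BASE"] = "https://oai.hconeai.com/v1"
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder

# from langchain.chat_models import ChatOpenAI
# llm = ChatOpenAI()  # would now send requests via oai.hconeai.com
```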
and then you are getting this as the response when trying to call openai?
Yes, after the request it returned 403 and displayed this page.
Just like this: Retrying langchain.chat_models.openai.ChatOpenAI.completion_with_retry.<locals>._completion_with_retry in 4.0 seconds as it raised APIError: HTTP code 403 from API (<!DOCTYPE html>
<!--[if lt IE 7]> <html class="no-js ie6 oldie" lang="en-US"> <![endif]-->
<!--[if IE 7]> <html class="no-js ie7 oldie" lang="en-US"> <![endif]-->
This is only displayed when using Helicone, and not OpenAI directly. correct?
I ran a test, and calling OpenAI directly through Postman worked.
I think I found the reason. Because OpenAI is not supported in my region (mainland China), we use Cloudflare as a proxy for access. It worked until three hours ago. I tried using a VPN to change my IP address to another region, and the Helicone API worked. When I changed the IP back, I received 403 again. It seems this is a problem I need to solve. Sorry for the inconvenience.
I tested the Cloudflare proxy I deployed myself and hit the same issue. It must be OpenAI blocking me through Cloudflare, not Helicone.
Does Helicone set the X-Forwarded-For header?
not sure what your question is
I don't think there is any way to solve this without tunneling your traffic through an appropriate region like the US.
I access OpenAI through Helicone from a mainland China IP,
so OpenAI sees the IP provided by Helicone, not mine,
unless Helicone tells OpenAI's Cloudflare that it is a proxy forwarding for a mainland China IP.
Helicone will use the closest worker that is deployed. In that case, it will use the worker that is deployed in Mainland China. Sorry
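The "forwarded for" exchange above can be sketched as follows. This is a hypothetical illustration of how a transparent proxy passes the original caller's IP upstream, so the origin's Cloudflare can apply region blocks against the real client; whether Helicone actually sets this header is not confirmed in the thread, and the IP is a documentation example.

```python
# Hypothetical headers a transparent proxy would send upstream.
client_ip = "203.0.113.7"  # example client IP (documentation range)

upstream_headers = {
    "Authorization": "Bearer sk-...",  # placeholder API key
    "X-Forwarded-For": client_ip,      # original caller's IP
}
# Note: if the proxy's worker runs at the edge closest to the client
# (as described above), its egress IP is in the client's region anyway,
# so the region block applies even without this header.
```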
Someone may access China's LLM services through Helicone, so you deploy Helicone workers all over the world, right?
You could handle oai.helicone.com differently.
gateway.helicone.com may be used to access China's LLM services.
We use Helicone for two reasons: proxying and tracing.
I am sorry, there is nothing we can do.
I would recommend setting up your service to proxy traffic outside of China
If we have to deploy a proxy ourselves, then I would just use LangSmith for tracing.
Our proxy was never designed to evade mainland China's ban on OpenAI API traffic.
You need to deploy your app somewhere that is allowed to use OpenAI
I have deployed a Cloudflare Worker myself. Although Cloudflare also offers AI Gateway, I still think what you have built is more comprehensive. I have already deployed a new server in Singapore for forwarding, and it works. Could Helicone provide a better solution, such as having Cloudflare skip certain nodes that OpenAI doesn't support (e.g. hkg1)? That way more people could enjoy the convenience Helicone brings.
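The Singapore relay described above could be sketched roughly like this (the user's actual deployment is a Cloudflare Worker; this is a stdlib-only Python equivalent, with the upstream host taken from the thread): a tiny HTTP forwarder running in an allowed region re-issues incoming requests against the Helicone proxy, so clients in a blocked region keep the tracing benefits.

```python
# Minimal sketch of a regional relay, assuming it runs where OpenAI
# traffic is permitted. Not production-ready: no streaming, no error
# handling for non-2xx upstream responses.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

UPSTREAM = "https://oai.hconeai.com"  # Helicone proxy, per the thread

class RelayHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        # Pass through only auth/content headers; drop hop-by-hop headers.
        fwd = {k: v for k, v in self.headers.items()
               if k.lower() in ("authorization", "content-type", "helicone-auth")}
        upstream = Request(UPSTREAM + self.path, data=body,
                           headers=fwd, method="POST")
        with urlopen(upstream) as resp:
            self.send_response(resp.status)
            self.send_header("Content-Type",
                             resp.headers.get("Content-Type", ""))
            self.end_headers()
            self.wfile.write(resp.read())

# To run (blocks forever):
# HTTPServer(("0.0.0.0", 8080), RelayHandler).serve_forever()
```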