Helicone Community Page

"Embeddings are too large to serve in bulk"

I'm trying to switch from OpenAI to Azure OpenAI, and now I get this response when calling embeddings:

{
"heliconeMessage": "Embeddings are too large to serve in bulk"
}

Not sure what this means or how switching to Azure could cause it. Google returns zero results for this error.
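
For context, a minimal sketch of how an Azure OpenAI embeddings call is typically routed through Helicone with the official openai Python SDK; the gateway URL, Helicone header names, API version, and deployment name below are assumptions based on common Helicone Azure setups, not values confirmed in this thread:

import os
from openai import AzureOpenAI

# Point the client at Helicone's gateway instead of the Azure endpoint directly.
# The gateway URL and Helicone-* header names are assumptions; check Helicone's
# Azure integration docs for the exact values.
client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2023-05-15",                      # placeholder API version
    azure_endpoint="https://oai.helicone.ai",      # assumed Helicone proxy URL
    default_headers={
        "Helicone-Auth": f"Bearer {os.environ['HELICONE_API_KEY']}",
        # Assumed header telling Helicone where to forward the request:
        "Helicone-OpenAI-Api-Base": "https://<your-resource>.openai.azure.com",
    },
)

# For Azure, `model` is the deployment name, not the base model name.
resp = client.embeddings.create(
    model="my-embedding-deployment",               # placeholder deployment name
    input="hello world",
)
print(len(resp.data[0].embedding))
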
8 comments
Seems like a fluke, as on all other tries I get 'Resource not found'. Man, I thought this would be an easy transition using LangChain, but I guess not.
Hey, we’ll get on this right away. Thanks for flagging it.

When we have an update, we’ll let you know.
Hi, this is actually expected. I am guessing you are seeing this in the GraphQL API?
Just in the JSON view of the request (UI)
I thought I was on to getting Azure to work, but I’ll need to get back to this later.
I thought it would just mean changing a few env vars, but I can't get it to work.
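
For reference, a rough sketch of what the env-var and parameter change usually looks like on the LangChain side when moving from OpenAIEmbeddings to AzureOpenAIEmbeddings; the endpoint, deployment name, and API version below are placeholders, and a wrong deployment name or API version is a common cause of Azure's 'Resource not found' error:

import os
from langchain_openai import AzureOpenAIEmbeddings

# Azure needs its own endpoint, key, API version, and a *deployment* name
# (all values below are placeholders).
os.environ["AZURE_OPENAI_API_KEY"] = "<azure-key>"
os.environ["AZURE_OPENAI_ENDPOINT"] = "https://<your-resource>.openai.azure.com"

embeddings = AzureOpenAIEmbeddings(
    azure_deployment="my-embedding-deployment",  # deployment name, not model name
    openai_api_version="2023-05-15",             # placeholder API version
)

vector = embeddings.embed_query("hello world")
print(len(vector))
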
Basically we have your embeddings, but we don't serve them back in the response because they are too large for our serverless functions.
Do you need to look at the embeddings on the frontend?