Helicone Community Page
Updated 3 months ago
"Embeddings are too large to serve in bulk"
Tim van Heugten
last year
I'm trying to switch from OpenAI to Azure OpenAI and now get this when calling embeddings:
{
"heliconeMessage": "Embeddings are too large to serve in bulk"
}
Not sure what this means or how switching to Azure could cause it. Google returns zero results on this.
8 comments
Tim van Heugten
last year
Seems like a fluke, as on all other tries I get 'Resource not found'. Man, I thought this would be an easy transition using LangChain, but I guess not.
ayoKho
last year
Hey, we'll get on this right away. Thanks for flagging it.
When we have an update we'll let you know.
Justin
last year
Hi, this is actually expected. I'm guessing you're seeing this in the GraphQL API?
Tim van Heugten
last year
Just in the JSON view of the request (UI).
Tim van Heugten
last year
I thought I was on to getting Azure to work, but I’ll need to get back to this later.
Tim van Heugten
last year
I thought it would mean changing a few env vars, but I can't get it to work.
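For reference, the env vars LangChain's Azure OpenAI integration reads look roughly like the sketch below. The variable names follow the `langchain_openai` documentation; the endpoint, key, and API version are placeholders for your own Azure resource, not values from this thread.

```shell
# Sketch of the env vars for langchain_openai's AzureOpenAIEmbeddings.
# All values below are placeholders -- substitute your own resource.
export AZURE_OPENAI_ENDPOINT="https://<your-resource>.openai.azure.com/"
export AZURE_OPENAI_API_KEY="<your-azure-key>"
export OPENAI_API_VERSION="2023-05-15"
```

Note that a 'Resource not found' from Azure OpenAI usually points at a wrong endpoint, deployment name, or API version rather than a bad key.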
Justin
last year
Basically, we have your embeddings but don't serve them back because they are too large for our serverless functions.
Justin
last year
Do you need to look at the embeddings on the frontend?