Hi, I've tried to integrate the prompts feature packageless (
https://docs.helicone.ai/features/prompts/quick-start).
I have an issue: the input prompts aren't parsed in the "Prompts" view on the Helicone site. The string is just rendered as-is, with the JSX tags:
<helicone-input-prompt key="locale_name">en_UN</helicone-input-prompt>
Comparing with the screenshot in the documentation, I would expect the variable to be parsed so the JSX tags are stripped and only the key is shown:
locale_name
with the possibility to preview the variable's value.
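For reference, the message content I send embeds the variable with the same tag shown above; roughly like this (the surrounding wording here is just an example, real values are substituted before the request):

{"role": "user", "content": "Give the localized greeting for <helicone-input-prompt key=\"locale_name\">en_UN</helicone-input-prompt>."}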
Attaching a screenshot from my Helicone prompt page and another one from the documentation for comparison.
I've also sent another request with different inputs, and it seems it was marked as the next prompt version. I would expect the version to be bumped only when parts other than the input prompts change?
It looks like it doesn't even work? The prompt does show up in the Prompts tab, but maybe that's just based on the
Helicone-Prompt-Id
header?
I thought the issue was that I hadn't changed the URI to the gateway. However, after changing the URI to the gateway, the response is a 522.
Here is the curl command I'm using (with the gateway):
curl -X POST 'https://gateway.hconeai.com/v1/chat/completions' \
-H 'Accept: application/json' \
-H 'Authorization: Bearer token-here' \
-H 'Helicone-Property-Locale: locale-here' \
-H 'Helicone-Property-Prompt-Template-Name: template_name' \
-H 'Helicone-User-Id: [email protected]' \
-H 'Helicone-Cache-Enabled: true' \
-H 'Helicone-Cache-Bucket-Max-Size: 1' \
-H 'Helicone-Retry-Enabled: true' \
-H 'Helicone-Prompt-Id: template_name' \
-H 'Helicone-Target-Url: https://oai.hconeai.com' \
-H 'Helicone-Target-Provider: OpenAI' \
-d '{"model":"gpt-4-turbo-preview","messages":[{"role":"user","contentā:āPrompt here}],ātemperature":0,"response_format":{"type":"json_object"}}'