Helicone Community Page

How can I use the experiments/prompts feature if my prompt uses multiple system messages as examples?

Here is the example provided in the docs https://docs.helicone.ai/features/prompts
TypeScript
// 1. Add these lines
import { hpf, hpstatic } from "@helicone/prompts";

const chatCompletion = await openai.chat.completions.create(
  {
    messages: [
      {
        role: "system",
        // 2. Use hpstatic for static prompts
        content: hpstatic`You are a creative storyteller.`,
      },
      {
        role: "user",
        // 3: Add hpf to any string, and nest any variable in additional brackets `{}`
        content: hpf`Write a story about ${{ character }}`,
      },
    ],
    model: "gpt-3.5-turbo",
  },
  {
    // 4. Add Prompt Id Header
    headers: {
      "Helicone-Prompt-Id": "prompt_story",
    },
  }
);


My prompt looks like this though
TypeScript
messages: [
      {
        role: "system",
        content: `You are a creative storyteller.`,
      },
      {
        role: "user",
        content: `Example 1`,
      },
      {
        role: "assistant",
        content: "Output 1",
      },
      {
        role: "user",
        content: `Example 2`,
      },
      {
        role: "assistant",
        content: "Output 2",
      },
      {
        role: "user",
        content: input
      },
    ],
2 comments
That's fine. If something is not marked with hpstatic or hpf, it will just be treated as an auto-input.
It should just work!
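To make that concrete, here is a minimal sketch of how the few-shot prompt above could be wired up, assuming the standard Helicone proxy setup for the OpenAI client. The prompt id "prompt_few_shot_story" and the input value are placeholders, and wrapping only the final user message with hpf is one possible choice, not the only one; leaving content: input completely unmarked, as in the original snippet, also works and will just show up as an auto-input.
TypeScript
import OpenAI from "openai";
import { hpf, hpstatic } from "@helicone/prompts";

// Route requests through the Helicone proxy so the prompt can be tracked
const openai = new OpenAI({
  baseURL: "https://oai.helicone.ai/v1",
  defaultHeaders: {
    "Helicone-Auth": `Bearer ${process.env.HELICONE_API_KEY}`,
  },
});

const input = "Tell me a story about a dragon"; // placeholder runtime input

const chatCompletion = await openai.chat.completions.create(
  {
    messages: [
      {
        role: "system",
        // Static instruction: hpstatic versions it without templating
        content: hpstatic`You are a creative storyteller.`,
      },
      // Few-shot example turns left as plain strings
      { role: "user", content: "Example 1" },
      { role: "assistant", content: "Output 1" },
      { role: "user", content: "Example 2" },
      { role: "assistant", content: "Output 2" },
      {
        role: "user",
        // Only the real runtime variable is wrapped with hpf
        content: hpf`${{ input }}`,
      },
    ],
    model: "gpt-3.5-turbo",
  },
  {
    headers: {
      // Placeholder prompt id
      "Helicone-Prompt-Id": "prompt_few_shot_story",
    },
  }
);

console.log(chatCompletion.choices[0].message.content);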