Helicone Community Page

Updated 5 months ago

Getting the Error When Attempting to Use `hprompt` in a Next.js App Directory API Route

Getting the error

Plain Text
 ⨯ ../../node_modules/fsevents/fsevents.node
Module parse failed: Unexpected character '�' (1:0)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
(Source code omitted for this binary file)


traced from:

Plain Text
../../node_modules/@helicone/helicone/dist/async_logger/HeliconeAsyncLogger.js
../../node_modules/@helicone/helicone/dist/index.js


when attempting to use `hprompt` in a Next.js App Directory API route.
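For context (not stated in the thread): webpack cannot parse native `.node` binaries, so any server-side import chain that reaches `fsevents` breaks the build. A possible workaround, assuming Next.js 13/14 with the App Router, is to keep the SDK and the native module out of the webpack bundle. This is a hypothetical sketch, not a fix confirmed in this thread:

```javascript
// next.config.js - a sketch, not a confirmed fix for this repo.
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    // Next.js 13/14 option: load these packages with Node's native
    // require at runtime instead of bundling them with webpack.
    serverComponentsExternalPackages: [
      "@helicone/helicone",
      "@traceloop/node-server-sdk",
    ],
  },
  webpack: (config, { isServer }) => {
    if (isServer) {
      // fsevents is a macOS-only native binary; never bundle it.
      config.externals.push("fsevents");
    }
    return config;
  },
};

module.exports = nextConfig;
```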
Hi @caelin, if you are trying to do prompt tracking with async, it will not work
Is there a recommended way to do prompt tracking in production environments then?
Plain Text
import { hprompt } from "@helicone/helicone";

const chatCompletion = await openai.chat.completions.create(
  {
    messages: [
      {
        role: "user",
        // 2: Add hprompt to any string, and nest any variable in additional brackets `{}`
        content: hprompt`Write a story about ${{ character }}`,
      },
    ],
    model: "gpt-3.5-turbo",
  },
  {
    // 3. Add Prompt Id Header
    headers: {
      "Helicone-Prompt-Id": "prompt_story",
    },
  }
);
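As an aside (a sketch, not Helicone's actual implementation): `hprompt` is a tagged template literal, and the `${{ character }}` shorthand passes an object whose key names the variable. A minimal stand-in tag, here called `promptTag`, illustrates how such a template could flatten back to a plain string:

```javascript
// Hypothetical stand-in for a tagged-template helper like hprompt.
// It joins the literal parts and substitutes each interpolated value;
// an object like `{ character }` carries one named variable, so its
// value is used while the key would serve as the variable's name.
function promptTag(strings, ...values) {
  return strings.reduce((out, part, i) => {
    if (i === 0) return part;
    const v = values[i - 1];
    const text =
      typeof v === "object" && v !== null
        ? String(Object.values(v)[0])
        : String(v);
    return out + text + part;
  }, "");
}

const character = "a dragon";
const prompt = promptTag`Write a story about ${{ character }}`;
console.log(prompt); // "Write a story about a dragon"
```

The real SDK additionally records the variable names for the prompt template; this sketch only shows why the tagged result can be passed anywhere a plain string is expected.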


This is the recommended approach right?
Or I guess how is prompt tracking intended to be used? In a QA / evaluation pipeline or within our production server?
The intended use is within the production server
Can you use the proxy in your production server?
Yeah we're using the proxy already
For context I've attached our code below:

Plain Text
export const respondLink = async (
  messages: ChatMessage[],
  username?: string,
) => {
  const { text } = await generateText({
    model,
    headers: {
      "Helicone-Prompt-Id": "respond_link",
      "Helicone-User-Id": username,
    },
    system: `Prompt here`,
    messages,
  });

  return text;
};

where `generateText` is from the Vercel AI SDK, configured with `baseURL: "https://oai.hconeai.com/v1"` - I also tried returning the `generateText` call directly
Should hprompt be used to set a constant before being used in the generateText function?
Plain Text
const linkPrompt = hprompt`prompt`;

export const respondLink = async (
  messages: ChatMessage[],
  username?: string,
) => {
  const { text } = await generateText({
    model,
    headers: {
      "Helicone-Prompt-Id": "respond_link",
      "Helicone-User-Id": username,
    },
    system: linkPrompt.toString(),
    messages,
  });

  return text;
};


This also doesn't work
And neither does putting hprompt inline
Hi @caelin !

It should be something like this

Plain Text
const linkPrompt = hprompt`prompt ${{ hello: "world" }}`;

export const respondLink = async (
  messages: ChatMessage[],
  username?: string,
) => {
  const { text } = await generateText({
    model,
    headers: {
      "Helicone-Prompt-Id": "respond_link",
      "Helicone-User-Id": username,
    },
    system: linkPrompt,
    messages,
  });

  return text;
};
Still running into the same issue
Plain Text
 ⨯ ../../node_modules/fsevents/fsevents.node
Module parse failed: Unexpected character '�' (1:0)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
(Source code omitted for this binary file)

Import trace for requested module:
../../node_modules/fsevents/fsevents.node
../../node_modules/fsevents/fsevents.js
../../node_modules/chokidar/lib/fsevents-handler.js
../../node_modules/chokidar/index.js
../../node_modules/nunjucks/src/node-loaders.js
../../node_modules/nunjucks/src/loaders.js
../../node_modules/nunjucks/index.js
../../node_modules/@traceloop/node-server-sdk/dist/index.mjs
../../node_modules/@helicone/helicone/dist/async_logger/HeliconeAsyncLogger.js
../../node_modules/@helicone/helicone/dist/index.js
./src/app/api/webhooks/twilio/_helpers/respond-link.ts
./src/app/api/webhooks/twilio/_helpers/response-delegator.ts
./src/app/api/webhooks/twilio/process-twilio-message.ts
./src/app/api/webhooks/twilio/development/route.ts
 ⨯ ../../node_modules/fsevents/fsevents.node
Module parse failed: Unexpected character '�' (1:0)
You may need an appropriate loader to handle this file type, currently no loaders are configured to process this file. See https://webpack.js.org/concepts#loaders
(Source code omitted for this binary file)
Are you free to hop on a call next week? We are making some changes to our prompt library this week
Chatting to Cole later today!
Plain Text
export const openai = createOpenAI({
  apiKey: env.OPENAI_API_KEY,
  baseURL: "https://oai.hconeai.com/v1",
  headers: {
    "Helicone-Auth": `Bearer ${env.HELICONE_API_KEY}`,
    "Helicone-Property-Environment": process.env.NODE_ENV ?? "development",
  },
});

export const model = openai("gpt-4-vision-preview");
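The header setup above can be exercised in isolation. Here is a hypothetical helper (not part of any SDK, written for this thread) that builds the same Helicone headers and shows the `?? "development"` fallback when `NODE_ENV` is unset:

```javascript
// Hypothetical helper mirroring the createOpenAI header setup above.
function heliconeHeaders(heliconeApiKey, nodeEnv) {
  return {
    "Helicone-Auth": `Bearer ${heliconeApiKey}`,
    // Tag requests with the deploy environment; default to "development".
    "Helicone-Property-Environment": nodeEnv ?? "development",
  };
}

const headers = heliconeHeaders("sk-example", process.env.NODE_ENV);
console.log(headers["Helicone-Auth"]); // "Bearer sk-example"
```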
Hi @caelin, I am able to get this to work...

Not sure why.


Plain Text
  const openai = createOpenAI({
    apiKey: "sk-",
    baseURL: "https://oai.hconeai.com/v1",
    headers: {
      "Helicone-Auth": `Bearer sk-`,
      "Helicone-Property-Environment": process.env.NODE_ENV ?? "development",
    },
  });

  const { text } = await generateText({
    model: openai("gpt-4-turbo"),
    system: hprompt`Give me a short response!`,
    prompt: hprompt`${{ color: "blue" }} or green`,
    headers: {
      "Helicone-Prompt-Id": "hello",
    },
  });
I am not able to reproduce the issue
Hmm lemme try switching models
We use chat instead of prompt, which may cause it?
Can you send package.json
it's a bit bloated
oh let me check with chat
can you send me your generate text?
Nah, that wasn't it - saw my OpenAI dep was at .13, so maybe that's it
🤷‍♂️
Such a classic issue - I'll figure it out, but lmk when the new prompting library is up; hopefully that'll fix it
Actually could you run npm ls @opentelemetry/exporter-jaeger
That doesn't seem to exist in mine, so now I'm thinking there's some sort of dependency mismatch issue since we're in a fat monorepo
Plain Text
npm ls @opentelemetry/exporter-jaeger
[email protected] /Users/justin/Documents/repos/promptzero/helicone/web
└── (empty)
Plain Text
lookbk@ /Users/caelinsutch/Github/lookbk
└─┬ @lookbk/[email protected] -> ./packages/next
  └─┬ @helicone/[email protected]
    └─┬ @traceloop/[email protected]
      └── @opentelemetry/[email protected]

Is this what you have for npm ls @opentelemetry/sdk-node?
ohh interesting... okay I think this will be solved with the new package
let's wait until then; I will keep you posted
We released a new package
can you try installing @helicone/prompts instead?
and then adding the following tag
hpf instead of hprompt
everything else should be the same
Seems to work fine, but I'm not seeing anything show up in the console
sample code for context
Attachment
Screenshot_2024-07-20_at_7.00.14_PM.png
Sessions also aren't tracking properly
Hi @caelin sorry for the delay here
Can you elaborate what you mean they are not tracking properly?
Both sessions and prompts aren't showing up in the dashboard
Hey @caelin, are you free to hop on a call sometime tomorrow or Friday?