Commit 620f79d

dqbd and jacoblee93 authored
feat(ai-sdk): documentation (#491)
Co-authored-by: jacoblee93 <[email protected]>
1 parent 35462ee commit 620f79d

File tree: 2 files changed, +143 −36 lines

Binary file not shown.

docs/observability/how_to_guides/tracing/trace_with_vercel_ai_sdk.mdx

Lines changed: 143 additions & 36 deletions
---
sidebar_position: 17
---

import { ConfigureSDKEnvironmentCodeTabs } from "@site/src/components/QuickStart";
import {
  CodeTabs,
  typescript,
} from "@site/src/components/InstructionsWithCode";

# Trace with the Vercel AI SDK (JS/TS only)

You can use LangSmith to trace runs from the [Vercel AI SDK](https://sdk.vercel.ai/docs/introduction) with our built-in `AISDKExporter` OpenTelemetry trace exporter. This guide will walk through an example.

:::note
The `AISDKExporter` class is only available in `langsmith` JS SDK version `>=0.2.1`.
:::

## 0. Installation

Install the Vercel AI SDK. We use their OpenAI integration for the code snippets below, but you can use any of their other options as well.
<CodeTabs
  tabs={[
    {
      value: "typescript",
      label: "yarn",
      language: "bash",
      content: `yarn add ai @ai-sdk/openai zod`,
    },
    {
      value: "npm",
      label: "npm",
      language: "bash",
      content: `npm install ai @ai-sdk/openai zod`,
    },
    {
      value: "pnpm",
      label: "pnpm",
      language: "bash",
      content: `pnpm add ai @ai-sdk/openai zod`,
    },
  ]}
/>

## 1. Configure your environment

<ConfigureSDKEnvironmentCodeTabs />
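The component above renders the full setup. As a quick reference (values are placeholders, not real keys), LangSmith tracing is typically enabled with environment variables along these lines:

```shell
# Enable LangSmith tracing (placeholder values; see the tabs above).
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=<your-langsmith-api-key>

# The OpenAI integration used in the snippets below also needs:
export OPENAI_API_KEY=<your-openai-api-key>
```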

## 2. Log a trace

### Next.js

First, create an `instrumentation.js` file in your project root. Learn more about how to set up OpenTelemetry instrumentation within your Next.js app [here](https://nextjs.org/docs/app/api-reference/file-conventions/instrumentation).

```ts
import { registerOTel } from "@vercel/otel";
import { AISDKExporter } from "langsmith/vercel";

export function register() {
  registerOTel({
    serviceName: "langsmith-vercel-ai-sdk-example",
    // highlight-next-line
    traceExporter: new AISDKExporter(),
  });
}
```

Afterwards, add the `experimental_telemetry` argument to the AI SDK calls that you want to trace. For convenience, we've included the `AISDKExporter.getSettings()` method, which appends additional metadata for LangSmith.

```ts
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

await streamText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  // highlight-next-line
  experimental_telemetry: AISDKExporter.getSettings(),
});
```

### Node.js

Add the `AISDKExporter` as the trace exporter in your OpenTelemetry setup.

```ts
import { AISDKExporter } from "langsmith/vercel";

import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";

const sdk = new NodeSDK({
  // highlight-next-line
  traceExporter: new AISDKExporter(),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();
```

Afterwards, add the `experimental_telemetry` argument to the AI SDK calls that you want to trace.

:::info
Do not forget to call `await sdk.shutdown()` before your application shuts down in order to flush any remaining traces to LangSmith.
:::
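One way to make sure the flush happens on graceful shutdown is to wire it into a termination-signal handler. The sketch below is not part of the SDK: `flushTraces` is a hypothetical helper, and the `shutdown` parameter simply mirrors the `NodeSDK#shutdown` signature so any instance with that method can be passed in.

```ts
// Sketch: flush buffered spans before the process exits.
// The parameter type matches NodeSDK#shutdown, so the `sdk`
// instance created above can be passed directly.
async function flushTraces(sdk: { shutdown: () => Promise<void> }): Promise<boolean> {
  try {
    await sdk.shutdown(); // sends any remaining traces to LangSmith
    return true;
  } catch (err) {
    console.error("Failed to flush traces:", err);
    return false;
  }
}

// Example wiring for graceful shutdown (assumes `sdk` is in scope):
//   process.on("SIGTERM", async () => {
//     process.exit((await flushTraces(sdk)) ? 0 : 1);
//   });
```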

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

const result = await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  // highlight-next-line
  experimental_telemetry: AISDKExporter.getSettings(),
});

// highlight-next-line
await sdk.shutdown();
```

## Customize run name

You can customize the run name by passing the `runName` argument to the `AISDKExporter.getSettings()` method.

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  // highlight-start
  experimental_telemetry: AISDKExporter.getSettings({
    runName: "my-custom-run-name",
  }),
  // highlight-end
});
```

## Customize run ID

You can customize the run ID by passing the `runId` argument to the `AISDKExporter.getSettings()` method. This is especially useful if you want to know the run ID before the run has completed.

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";

await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  // highlight-start
  experimental_telemetry: AISDKExporter.getSettings({
    runId: "my-custom-run-id",
  }),
  // highlight-end
});
```
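If you want to reference the run while it is still in flight (for example, to log a link to the trace up front), you can generate the ID yourself before making the call. Run IDs in LangSmith are UUIDs, so a generated UUID is a safe choice. This is a sketch: the model call is shown commented out because it requires API credentials and the setup above.

```ts
import { randomUUID } from "node:crypto";

// Generate the run ID before making the call, so it can be logged
// or stored while the run is still in flight.
const myRunId = randomUUID();
console.log(`LangSmith run ID: ${myRunId}`);

// Then pass it through as in the example above (requires API credentials):
//   await generateText({
//     model: openai("gpt-4o-mini"),
//     prompt: "Write a vegetarian lasagna recipe for 4 people.",
//     experimental_telemetry: AISDKExporter.getSettings({ runId: myRunId }),
//   });
```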

## Custom LangSmith client

You can also pass a LangSmith client instance into the `AISDKExporter` constructor:

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { AISDKExporter } from "langsmith/vercel";
import { Client } from "langsmith";

import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";

const langsmithClient = new Client({});

const sdk = new NodeSDK({
  // highlight-next-line
  traceExporter: new AISDKExporter({ client: langsmithClient }),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
  experimental_telemetry: AISDKExporter.getSettings(),
});
```

## `wrapAISDKModel` (deprecated)

:::note
The `wrapAISDKModel` method is deprecated and will be removed in a future release.
:::

The `wrapAISDKModel` method wraps the Vercel model wrapper and intercepts model invocations to send traces to LangSmith. This method is useful if you are using an older version of LangSmith, or if you are using `streamUI` / Vercel AI RSC, which currently does not support `experimental_telemetry`.

```ts
import { wrapAISDKModel } from "langsmith/wrappers/vercel";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const vercelModel = openai("gpt-4o-mini");

const modelWithTracing = wrapAISDKModel(vercelModel);

await generateText({
  model: modelWithTracing,
  prompt: "Write a vegetarian lasagna recipe for 4 people.",
});
```
