Commit d51b74b

docs(langchainjs): Update LangChain.js callback backgrounding recommendations (#431)
On hold until JS docs site has fully updated.
1 parent 80b4e60 commit d51b74b

2 files changed (+10, -3 lines)

docs/tracing/faq/langchain_specific_guides.mdx (2 additions, 1 deletion)

@@ -187,7 +187,8 @@ This tactic is also useful for when you have multiple chains running in a shared
 
 In LangChain Python, LangSmith's tracing is done in a background thread to avoid obstructing your production application. This means that your process may end before all traces are successfully posted to LangSmith. This is especially prevalent in a serverless environment, where your VM may be terminated immediately once your chain or agent completes.
 
-In LangChain JS, the default is to block for a short period of time for the trace to finish due to the greater popularity of serverless environments. You can make callbacks asynchronous by setting the `LANGCHAIN_CALLBACKS_BACKGROUND` environment variable to `"true"`.
+In LangChain JS, prior to `@langchain/core` version `0.3.0`, the default was to block for a short period of time for the trace to finish due to the greater popularity of serverless environments. Versions `>=0.3.0` will have the same default as Python.
+You can explicitly make callbacks synchronous by setting the `LANGCHAIN_CALLBACKS_BACKGROUND` environment variable to `"false"` or asynchronous by setting it to `"true"`. You can also check out [this guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more options for awaiting backgrounded callbacks in serverless environments.
 
 For both languages, LangChain exposes methods to wait for traces to be submitted before exiting your application.
 Below is an example:

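The "Below is an example:" context line refers to a code sample outside this hunk. For orientation, here is a minimal sketch of what waiting for traces looks like in LangChain.js, assuming `@langchain/core`'s `awaitAllCallbacks` helper and a placeholder `@langchain/openai` model (neither the model nor the prompt is part of this commit):

```typescript
// Sketch: flush backgrounded LangSmith callbacks before a script exits.
// The model and prompt are placeholders; the awaitAllCallbacks pattern is the point.
import { awaitAllCallbacks } from "@langchain/core/callbacks/promises";
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({ model: "gpt-4o-mini" });

async function main() {
  const result = await model.invoke("Say hello to LangSmith.");
  console.log(result.content);

  // With LANGCHAIN_CALLBACKS_BACKGROUND=true, trace submission happens in the
  // background; wait for it so the process doesn't exit with traces pending.
  await awaitAllCallbacks();
}

main();
```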
src/components/QuickStart.js (8 additions, 2 deletions)

@@ -258,9 +258,15 @@ export function ConfigureLangChainEnvironmentCodeTabs() {
 export LANGCHAIN_API_KEY=<your-api-key>
 # The below examples use the OpenAI API, though it's not necessary in general
 export OPENAI_API_KEY=<your-openai-api-key>`;
-const typescriptFootnote = `If you are using LangChain with LangSmith and are not in a serverless environment, we also suggest setting the following to reduce latency:
+const typescriptFootnote = `If you are using LangChain.js with LangSmith and are not in a serverless environment, we also recommend setting the following explicitly to reduce latency:
 
-\`export LANGCHAIN_CALLBACKS_BACKGROUND=true\``;
+\`export LANGCHAIN_CALLBACKS_BACKGROUND=true\`
+
+If you are in a serverless environment, we recommend setting the reverse to allow tracing to finish before your function ends:
+
+\`export LANGCHAIN_CALLBACKS_BACKGROUND=false\`
+
+See [this LangChain.js guide](https://js.langchain.com/docs/how_to/callbacks_serverless) for more information.`;
 return (
 <CodeTabs
 tabs={[

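For the serverless recommendation added in this footnote, a rough sketch of the other option the linked guide covers: awaiting backgrounded callbacks inside the handler itself. The handler shape, event type, and model below are hypothetical and not part of this commit.

```typescript
// Sketch: a generic serverless handler that waits for pending LangSmith
// callbacks before returning, so traces aren't lost when the runtime freezes
// or recycles the function. Handler signature and model are placeholders.
import { awaitAllCallbacks } from "@langchain/core/callbacks/promises";
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({ model: "gpt-4o-mini" });

export const handler = async (event: { question: string }) => {
  try {
    const result = await model.invoke(event.question);
    return { statusCode: 200, body: String(result.content) };
  } finally {
    // Only needed when LANGCHAIN_CALLBACKS_BACKGROUND=true; with "false" the
    // invoke call itself blocks until callbacks (including tracing) complete.
    await awaitAllCallbacks();
  }
};
```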