Stream with agentWithChatHistory #4474
Replies: 3 comments
-
🤖 Based on the context provided, it seems like the issue you're facing is related to the handling of streaming in the LangChain framework. To get the output as it's being generated, you need to listen for the streamed token entries in the run log rather than waiting for the final value:

```typescript
let currentOutput = "";

// streamLog yields JSON-patch-style updates (RunLogPatch) as the run progresses.
// Replace "..." with your actual input (and config, if any).
for await (const chunk of agentWithChatHistory.streamLog({ input: "..." })) {
  for (const op of chunk.ops) {
    // Accumulate tokens appended to the chat model's streamed output log entry.
    if (op.op === "add" && op.path.startsWith("/logs/ChatOpenAI:2/streamed_output_str")) {
      currentOutput += op.value;
      console.log(currentOutput);
    }
  }
}
console.log("Stream completed");
```

In this example, we iterate over the stream and check each update. If an update is an 'add' operation on the streamed output path, we append its value to the accumulated output. You can find more details in the examples/src/agents/stream_log.ts file in the LangChainJS repository.

Lastly, one user mentioned that the issue was resolved by updating to Bun 0.8, which implies that the problem might have been related to the runtime environment rather than the LangChain framework itself. However, this comment did not receive any reactions, so it should be considered with lower precedence.

I hope this helps! If you have any other questions or need further clarification, feel free to ask.
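If your agentWithChatHistory is an AgentExecutor wrapped in RunnableWithMessageHistory, you can keep session handling and still stream. The following is a minimal sketch rather than code from this thread: the in-memory history store, the input/history keys, and the sessionId value are illustrative, and the agentExecutor is assumed to be built as in the structured-chat example further down.

```typescript
import { AgentExecutor } from "langchain/agents";
import { RunnableWithMessageHistory } from "@langchain/core/runnables";
import { ChatMessageHistory } from "langchain/stores/message/in_memory";

// Assumes an AgentExecutor constructed as in the example below.
declare const agentExecutor: AgentExecutor;

// One in-memory history per session (illustrative store).
const histories: Record<string, ChatMessageHistory> = {};

const agentWithChatHistory = new RunnableWithMessageHistory({
  runnable: agentExecutor,
  getMessageHistory: (sessionId: string) => {
    if (!histories[sessionId]) {
      histories[sessionId] = new ChatMessageHistory();
    }
    return histories[sessionId];
  },
  inputMessagesKey: "input",
  historyMessagesKey: "chat_history",
});

// Streaming still works: pass the session id via the config and consume
// the run-log patches exactly as in the snippet above.
let output = "";
for await (const chunk of agentWithChatHistory.streamLog(
  { input: "what is LangChain?" },
  { configurable: { sessionId: "session-123" } }
)) {
  for (const op of chunk.ops) {
    if (op.op === "add" && op.path.includes("streamed_output_str")) {
      output += op.value;
    }
  }
}
```

The sessionId in configurable is what RunnableWithMessageHistory uses to look up the right message history, so the same streaming code can serve multiple sessions.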
-
@dosu-bot is there an easier way to implement streaming that would allow me to continue using sessions as well?
-
🤖 Based on the issues you've encountered and the context provided, it seems that the streaming issue is tied to how the agent and its chat history are wired together. Rather than modifying the executor itself, an easier approach is to create the agent with createStructuredChatAgent and pass the chat history explicitly on each call:

```typescript
import { AgentExecutor, createStructuredChatAgent } from "langchain/agents";
import { pull } from "langchain/hub";
import type { ChatPromptTemplate } from "@langchain/core/prompts";
import { AIMessage, HumanMessage } from "@langchain/core/messages";
import { ChatOpenAI } from "@langchain/openai";
// Define the tools the agent will have access to.
const tools = [...];
// Get the prompt to use - you can modify this!
const prompt = await pull<ChatPromptTemplate>(
"hwchase17/structured-chat-agent"
);
const llm = new ChatOpenAI({
temperature: 0,
modelName: "gpt-3.5-turbo-1106",
});
const agent = await createStructuredChatAgent({
llm,
tools,
prompt,
});
const agentExecutor = new AgentExecutor({
agent,
tools,
});
const result = await agentExecutor.invoke({
input: "what is LangChain?",
});
// With chat history
const result2 = await agentExecutor.invoke({
input: "what's my name?",
chat_history: [
new HumanMessage("hi! my name is cob"),
new AIMessage("Hello Cob! How can I assist you today?"),
],
});
```

In this example, the agent is created with createStructuredChatAgent, wrapped in an AgentExecutor, and called with the chat history passed explicitly via the chat_history field, so you stay in control of which session messages are sent on each call.
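If you want token-level streaming from this executor while still passing chat history yourself, one option is to consume its event stream. This is a sketch rather than part of the original answer; it assumes a LangChainJS version that exposes streamEvents on runnables with the "v1" event schema, and depending on your version you may also need streaming: true on the ChatOpenAI instance.

```typescript
// Continues from the agentExecutor, HumanMessage, and AIMessage defined above.
const eventStream = agentExecutor.streamEvents(
  {
    input: "what's my name?",
    chat_history: [
      new HumanMessage("hi! my name is cob"),
      new AIMessage("Hello Cob! How can I assist you today?"),
    ],
  },
  { version: "v1" }
);

let streamedText = "";
for await (const event of eventStream) {
  // Token chunks from the underlying chat model arrive as on_chat_model_stream events.
  if (event.event === "on_chat_model_stream") {
    const content = event.data?.chunk?.content;
    if (typeof content === "string" && content.length > 0) {
      streamedText += content;
      process.stdout.write(content);
    }
  }
}
```

This keeps the manual chat_history approach from the example above while surfacing tokens as they are generated, instead of only seeing the final result from invoke.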
-
How do I use streaming with agentWithChatHistory? I have tried it, however it is only outputting the final piece once it is generated.