Stream answer from MultiPromptChain #4836
Replies: 1 comment
Hey @thijssdaniels, great to see you diving into new challenges with LangChain JS! How's everything going on your end?

Based on the code you've provided, it seems you're trying to use the MultiPromptChain to route a question to one of several destination chains. Currently, the chain's call resolves with the complete response only once the selected chain has finished, so the answer arrives all at once rather than as a stream. However, the promise returned by each chain call can be handled independently as soon as it resolves. To achieve streaming-like behavior from the chains in a MultiPromptChain setup, you could process each promise as it settles:

```javascript
const handleResponse = async (promise, name) => {
  const { text: result } = await promise;
  console.log(`${name}: ${result}`);
};

handleResponse(testPromise1, "Physics");
handleResponse(testPromise2, "Math");
handleResponse(testPromise3, "History");
```

This approach would allow each response to be processed and output as soon as it's available, which is closer to streaming the responses. However, for a true streaming implementation, especially if the responses themselves are large and can be streamed in parts, you would need to ensure that the underlying language model client supports streaming and is configured to emit tokens as they are generated.

I hope this helps! If you have any further questions, feel free to ask.
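As a concrete illustration of the pattern above, here is a self-contained sketch that can run in plain Node. The `fakeChainCall` helper, its delays, and the answer texts are stand-ins I made up for real `chain.call(...)` promises, purely to show that each result is logged as soon as its own promise settles:

```javascript
// Hypothetical stand-in for a chain.call(...) promise; delays and texts are illustrative only.
const fakeChainCall = (text, delayMs) =>
  new Promise((resolve) => setTimeout(() => resolve({ text }), delayMs));

// Log (and return) each result as soon as its promise settles,
// instead of awaiting all three in sequence.
const handleResponse = async (promise, name) => {
  const { text: result } = await promise;
  const line = `${name}: ${result}`;
  console.log(line);
  return line;
};

const testPromise1 = fakeChainCall("E = mc^2", 300);
const testPromise2 = fakeChainCall("2 + 2 = 4", 100);
const testPromise3 = fakeChainCall("1066, Battle of Hastings", 200);

// The call with the shortest delay resolves and prints first,
// regardless of the order the handlers were started in.
Promise.all([
  handleResponse(testPromise1, "Physics"),
  handleResponse(testPromise2, "Math"),
  handleResponse(testPromise3, "History"),
]).then((lines) => console.log(`${lines.length} responses handled`));
```

Note that this only interleaves whole responses; each individual answer still arrives in one piece.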
Description
I'm building a web application where I want the LLM to choose among different chains. I've successfully implemented this with MultiPromptChain; however, I was wondering if I can stream the answers from the chains instead of providing the responses all at once.
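For reference, token-level streaming generally means consuming an async iterable of chunks rather than awaiting one final string (LangChain JS chat models can, as far as I know, surface tokens via streaming callbacks such as `handleLLMNewToken`). The sketch below mimics that shape without any LangChain dependency: `mockTokenStream` and `streamAnswer` are hypothetical names, and the generator stands in for a real streaming model client:

```javascript
// Hypothetical stand-in for a streaming model client:
// yields one token at a time with a small delay.
async function* mockTokenStream(answer) {
  for (const token of answer.split(" ")) {
    await new Promise((r) => setTimeout(r, 50));
    yield token + " ";
  }
}

// Consume the stream, emitting each token as it arrives
// instead of waiting for the full answer.
async function streamAnswer(stream, onToken) {
  let full = "";
  for await (const token of stream) {
    full += token;
    onToken(token); // e.g. write to an HTTP response or a websocket
  }
  return full.trim();
}

streamAnswer(mockTokenStream("Streaming sends partial output early"), (t) =>
  process.stdout.write(t)
).then(() => process.stdout.write("\n"));
```

In a web app, the `onToken` callback is where you would forward each chunk to the client as it is produced.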
System Info
Node = v20.11.1
Yarn = v1.22.21
@langchain/core = 0.1.42