Use community chat model with chain #5232
-
Checked other resources
Commit to Help
Example Code

```js
app.get("/test", async (req, res) => {
  console.log("halo");

  const model = new chatOllama.ChatOllama({
    baseUrl: "http://localhost:11434",
    model: "llama3",
  });

  const Chain = new lc.LLMChain({
    llm: model,
    prompt: `*{userPrompt}*`,
  });

  const result = await Chain.call({
    userPrompt: `Translate "I love programming" into German.`,
  });

  res.status(200).send(result);
});
```

Description

I'm trying to use the chat model from Ollama with my existing chain. How can I replace the existing OpenAI chat model with it?

System Info
-
To integrate the ChatOllama model with your existing chain, you'll need to ensure that the ChatOllama instance conforms to the BaseLanguageModelInterface expected by LangChain. Here's how you can adjust your code:

```ts
import { ChatOllama } from "@langchain/community/chat_models/ollama";
import {
  BaseLanguageModelInterface,
  BaseLanguageModelCallOptions,
} from "@langchain/core/language_models/base";

// Typing the instance against BaseLanguageModelInterface verifies that it
// can be passed anywhere a chain expects a language model.
const model: BaseLanguageModelInterface<any, BaseLanguageModelCallOptions> =
  new ChatOllama({
    baseUrl: "http://localhost:11434",
    model: "llama2",
    format: "json",
  });
```

Make sure to replace the placeholders with the actual values you're using, especially the baseUrl and model parameters.
For more complex interactions, such as translations or integrating specific logic within the chat flow, you can use OllamaFunctions:

```ts
// OllamaFunctions ships in the experimental entrypoint rather than the
// community ollama module.
import { OllamaFunctions } from "langchain/experimental/chat_models/ollama_functions";
import { HumanMessage } from "@langchain/core/messages";

// OllamaFunctions wraps its own Ollama chat model under the hood, so it
// takes the same fields as ChatOllama instead of an `llm` option.
const ollamaFunctions = new OllamaFunctions({
  baseUrl: "http://localhost:11434",
  model: "llama2",
});

// The request is a HumanMessage; stream() takes a list of messages.
const inputMessage = new HumanMessage({
  content: `Translate "I love programming" into German.`,
});

const stream = await ollamaFunctions.stream([inputMessage]);

const chunks = [];
for await (const chunk of stream) {
  chunks.push(chunk.content);
}
console.log(chunks.join(""));
```

This setup allows you to stream input through the model and collect the output chunks, which is particularly useful for handling translations or other complex interactions. Adjust the baseUrl and model parameters to match your setup.
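If you want to send the chunks back to the browser from an Express route like the one in your question, you can write each chunk to the response as it arrives. A sketch, assuming the ollamaFunctions instance above; the /translate route name is made up for illustration:

```ts
app.get("/translate", async (req, res) => {
  res.setHeader("Content-Type", "text/plain; charset=utf-8");

  const stream = await ollamaFunctions.stream([
    new HumanMessage(`Translate "I love programming" into German.`),
  ]);

  // Forward each chunk to the client as soon as the model produces it.
  for await (const chunk of stream) {
    res.write(String(chunk.content));
  }
  res.end();
});
```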
-
@dosu Thanks. I also need to track token usage: with the OpenAI chat model I can use a callback, but that doesn't work with ChatOllama.
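For reference, the callback pattern being described looks roughly like this with ChatOpenAI (a sketch, not part of the original reply; the provider fills in llmOutput.tokenUsage, which is what the poster reports ChatOllama does not do):

```ts
import { ChatOpenAI } from "@langchain/openai";

const model = new ChatOpenAI({
  modelName: "gpt-3.5-turbo",
  callbacks: [
    {
      // handleLLMEnd receives the full LLMResult, including token counts
      // when the provider reports them.
      handleLLMEnd(output) {
        console.log(output.llmOutput?.tokenUsage);
      },
    },
  ],
});

await model.invoke(`Translate "I love programming" into German.`);
```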