How to use BaseChatMessageHistory with LangGraph
Prerequisites

This guide assumes familiarity with the following concepts:

We recommend that new LangChain applications take advantage of the built-in LangGraph persistence to implement memory.

In some situations, users may need to keep using an existing persistence solution for chat message history.

Here, we will show how to use LangChain chat message histories (implementations of BaseChatMessageHistory) with LangGraph.
Setup

```typescript
process.env.ANTHROPIC_API_KEY = "YOUR_API_KEY";
```
Install the required packages with your preferred package manager:

```bash
npm i @langchain/core @langchain/langgraph @langchain/anthropic
# or
yarn add @langchain/core @langchain/langgraph @langchain/anthropic
# or
pnpm add @langchain/core @langchain/langgraph @langchain/anthropic
```
ChatMessageHistory
A message history needs to be parameterized by a conversation ID, or perhaps by a (user ID, conversation ID) 2-tuple.

Many LangChain chat message histories will have a sessionId or some namespace to allow keeping track of different conversations. Please refer to the specific implementation to check how it is parameterized.

The built-in InMemoryChatMessageHistory does not contain such parameterization, so we will create a record to keep track of the message histories.
```typescript
import { InMemoryChatMessageHistory } from "@langchain/core/chat_history";

const chatsBySessionId: Record<string, InMemoryChatMessageHistory> = {};

const getChatHistory = (sessionId: string) => {
  let chatHistory: InMemoryChatMessageHistory | undefined =
    chatsBySessionId[sessionId];
  if (!chatHistory) {
    chatHistory = new InMemoryChatMessageHistory();
    chatsBySessionId[sessionId] = chatHistory;
  }
  return chatHistory;
};
```
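To see the keyed-lookup pattern in isolation, here is a dependency-free sketch of the same idea; the `Msg` type and `getHistory` helper are hypothetical stand-ins for illustration, not LangChain APIs:

```typescript
// Minimal stand-in for the pattern above, with no LangChain dependency.
// "Msg" is a hypothetical message shape used only for illustration.
type Msg = { role: string; content: string };

const histories: Record<string, Msg[]> = {};

// Return the history for a session, creating it on first access
function getHistory(sessionId: string): Msg[] {
  if (!histories[sessionId]) {
    histories[sessionId] = [];
  }
  return histories[sessionId];
}

getHistory("session-a").push({ role: "human", content: "hi" });
console.log(getHistory("session-a").length); // 1 — same array on repeat lookup
console.log(getHistory("session-b").length); // 0 — sessions are isolated
```

Repeated lookups with the same session ID return the same underlying history, while distinct IDs stay isolated — the same guarantee `getChatHistory` provides for `InMemoryChatMessageHistory` instances.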
Usage with LangGraph

Next, we will set up a basic chatbot using LangGraph. If you are not familiar with LangGraph, you should look at the following Quick Start Tutorial.

We will create a LangGraph node for the chat model, and manage the conversation history manually, taking into account the conversation ID passed as part of the RunnableConfig.

The conversation ID can either be passed as part of the RunnableConfig (as we will do here), or as part of the graph state.
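To make the two options concrete, here is a hedged, dependency-free sketch of how a node might resolve the conversation ID from either place; the type names and the `resolveSessionId` helper are illustrative stand-ins, not LangGraph types:

```typescript
// Simplified stand-ins for the two channels a node can read from
type NodeConfig = { configurable?: { sessionId?: string } };
type NodeState = { sessionId?: string };

// Prefer the config-provided ID (as in this guide), falling back to state
function resolveSessionId(state: NodeState, config: NodeConfig): string {
  const id = config.configurable?.sessionId ?? state.sessionId;
  if (id === undefined) {
    throw new Error("sessionId must be provided via config or state");
  }
  return id;
}

console.log(resolveSessionId({}, { configurable: { sessionId: "abc" } })); // "abc"
console.log(resolveSessionId({ sessionId: "xyz" }, {})); // "xyz"
```

Passing the ID via config keeps it out of the persisted state; putting it in state makes it visible to every node without threading the config through.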
```typescript
import { v4 as uuidv4 } from "uuid";
import { ChatAnthropic } from "@langchain/anthropic";
import {
  StateGraph,
  MessagesAnnotation,
  END,
  START,
} from "@langchain/langgraph";
import { HumanMessage } from "@langchain/core/messages";
import { RunnableConfig } from "@langchain/core/runnables";

// Define a chat model
const model = new ChatAnthropic({ modelName: "claude-3-haiku-20240307" });

// Define the function that calls the model
const callModel = async (
  state: typeof MessagesAnnotation.State,
  config: RunnableConfig
): Promise<Partial<typeof MessagesAnnotation.State>> => {
  if (!config.configurable?.sessionId) {
    throw new Error(
      "Make sure that the config includes the following information: {'configurable': {'sessionId': 'some_value'}}"
    );
  }
  const chatHistory = getChatHistory(config.configurable.sessionId as string);
  let messages = [...(await chatHistory.getMessages()), ...state.messages];
  if (state.messages.length === 1) {
    // First message, ensure it's in the chat history
    await chatHistory.addMessage(state.messages[0]);
  }
  const aiMessage = await model.invoke(messages);
  // Update the chat history
  await chatHistory.addMessage(aiMessage);
  return { messages: [aiMessage] };
};

// Define a new graph
const workflow = new StateGraph(MessagesAnnotation)
  .addNode("model", callModel)
  .addEdge(START, "model")
  .addEdge("model", END);

const app = workflow.compile();

// Create a unique session ID to identify the conversation
const sessionId = uuidv4();
const config = { configurable: { sessionId }, streamMode: "values" as const };

const inputMessage = new HumanMessage("hi! I'm bob");

for await (const event of await app.stream(
  { messages: [inputMessage] },
  config
)) {
  const lastMessage = event.messages[event.messages.length - 1];
  console.log(lastMessage.content);
}

// Here, let's confirm that the AI remembers our name!
const followUpMessage = new HumanMessage("what was my name?");

for await (const event of await app.stream(
  { messages: [followUpMessage] },
  config
)) {
  const lastMessage = event.messages[event.messages.length - 1];
  console.log(lastMessage.content);
}
```
Output:

```
hi! I'm bob
Hello Bob! It's nice to meet you. How can I assist you today?
what was my name?
You said your name is Bob.
```
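The history bookkeeping inside `callModel` can be sketched in isolation. This is a dependency-free stand-in using plain strings for messages; `mergeWithHistory` is a hypothetical helper, not part of LangChain:

```typescript
// Prepend stored history to the incoming state messages, and persist the
// lone incoming human message so later turns can see it (mirroring the
// state.messages.length === 1 check in callModel above).
function mergeWithHistory(history: string[], incoming: string[]): string[] {
  const messages = [...history, ...incoming];
  if (incoming.length === 1) {
    history.push(incoming[0]);
  }
  return messages;
}

const history: string[] = [];
mergeWithHistory(history, ["hi! I'm bob"]);
history.push("Hello Bob!"); // the AI reply is also persisted after the model call
const turn2 = mergeWithHistory(history, ["what was my name?"]);
console.log(turn2.length); // 3: first human turn, AI reply, new human turn
```

Because the graph is compiled without a checkpointer, each invocation starts from fresh state, so it is this manual merge — not LangGraph — that carries earlier turns into the prompt.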
Usage with RunnableWithMessageHistory

This how-to guide used the getMessages and addMessage interfaces of BaseChatMessageHistory directly.

Alternatively, you can use RunnableWithMessageHistory, since LCEL can be used inside any LangGraph node.

To do that, replace the following code:
```typescript
const callModel = async (
  state: typeof MessagesAnnotation.State,
  config: RunnableConfig
): Promise<Partial<typeof MessagesAnnotation.State>> => {
  if (!config.configurable?.sessionId) {
    throw new Error(
      "Make sure that the config includes the following information: {'configurable': {'sessionId': 'some_value'}}"
    );
  }
  const chatHistory = getChatHistory(config.configurable.sessionId as string);
  let messages = [...(await chatHistory.getMessages()), ...state.messages];
  if (state.messages.length === 1) {
    // First message, ensure it's in the chat history
    await chatHistory.addMessage(state.messages[0]);
  }
  const aiMessage = await model.invoke(messages);
  // Update the chat history
  await chatHistory.addMessage(aiMessage);
  return { messages: [aiMessage] };
};
```
with the corresponding instance of RunnableWithMessageHistory defined in your current application:
```typescript
const runnable = new RunnableWithMessageHistory({
  // ... configuration from existing code
});

const callModel = async (
  state: typeof MessagesAnnotation.State,
  config: RunnableConfig
): Promise<Partial<typeof MessagesAnnotation.State>> => {
  // RunnableWithMessageHistory takes care of reading the message history
  // and updating it with the new human message and AI response.
  const aiMessage = await runnable.invoke(state.messages, config);
  return {
    messages: [aiMessage],
  };
};
```