How to use BaseChatMessageHistory with LangGraph
Prerequisites
This guide assumes familiarity with chat message histories and LangGraph persistence.
We recommend that new LangChain applications use the built-in LangGraph persistence to implement memory.
In some cases, users may need to continue using an existing persistence solution for storing chat message history.
Here, we will show how to use LangChain chat message histories (implementations of BaseChatMessageHistory) with LangGraph.
Setup
process.env.ANTHROPIC_API_KEY = "YOUR_API_KEY";
- npm
- yarn
- pnpm
npm i @langchain/core @langchain/langgraph @langchain/anthropic
yarn add @langchain/core @langchain/langgraph @langchain/anthropic
pnpm add @langchain/core @langchain/langgraph @langchain/anthropic
ChatMessageHistory
The message history needs to be parameterized by a conversation ID, or perhaps by a (user ID, conversation ID) pair.
Many LangChain chat message histories will have a sessionId
or some namespace
to allow keeping track of different conversations. Please refer to the specific implementation to check how it is parameterized.
The built-in InMemoryChatMessageHistory
does not contain such a parameterization, so we will create a dictionary to keep track of the message histories.
import { InMemoryChatMessageHistory } from "@langchain/core/chat_history";
const chatsBySessionId: Record<string, InMemoryChatMessageHistory> = {};
const getChatHistory = (sessionId: string) => {
let chatHistory: InMemoryChatMessageHistory | undefined =
chatsBySessionId[sessionId];
if (!chatHistory) {
chatHistory = new InMemoryChatMessageHistory();
chatsBySessionId[sessionId] = chatHistory;
}
return chatHistory;
};
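The `getChatHistory` helper above is plain memoization: the first lookup for a session creates an empty history, and every later lookup returns the same object. The same idea can be sketched without any LangChain dependency, using a `Map` of string arrays in place of `InMemoryChatMessageHistory`:

```typescript
// Dependency-free sketch of the same keyed-history pattern.
const store = new Map<string, string[]>();

const getHistory = (sessionId: string): string[] => {
  let history = store.get(sessionId);
  if (!history) {
    history = [];
    store.set(sessionId, history);
  }
  return history;
};

getHistory("session-1").push("hi! I'm bob");
console.log(getHistory("session-1").length); // 1: the same array is returned
console.log(getHistory("session-2").length); // 0: a new session starts empty
```

Because lookups for the same key return the same underlying object, any messages appended through one reference are visible through all later lookups for that session.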
Usage with LangGraph
Next, we will set up a basic chatbot that uses LangGraph. If you are not familiar with LangGraph, you should check out the LangGraph Quickstart tutorial.
We will create a LangGraph node for the chat model, and manually manage the conversation history, taking into account the conversation ID passed as part of the RunnableConfig.
The conversation ID can be passed either as part of the RunnableConfig (as we do here) or as part of the graph state.
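If you preferred the graph-state route instead, one way to carry the session ID would be to extend the prebuilt message state with an extra channel. This is only a sketch, assuming LangGraph's `Annotation` API; the rest of this guide uses the RunnableConfig approach:

```typescript
import { Annotation, MessagesAnnotation } from "@langchain/langgraph";

// Sketch: extend the prebuilt messages state with a sessionId channel,
// so nodes could read state.sessionId instead of config.configurable.
const StateWithSession = Annotation.Root({
  ...MessagesAnnotation.spec,
  sessionId: Annotation<string>,
});
```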
import { v4 as uuidv4 } from "uuid";
import { ChatAnthropic } from "@langchain/anthropic";
import {
StateGraph,
MessagesAnnotation,
END,
START,
} from "@langchain/langgraph";
import { HumanMessage } from "@langchain/core/messages";
import { RunnableConfig } from "@langchain/core/runnables";
// Define a chat model
const model = new ChatAnthropic({ modelName: "claude-3-haiku-20240307" });
// Define the function that calls the model
const callModel = async (
state: typeof MessagesAnnotation.State,
config: RunnableConfig
): Promise<Partial<typeof MessagesAnnotation.State>> => {
if (!config.configurable?.sessionId) {
throw new Error(
"Make sure that the config includes the following information: {'configurable': {'sessionId': 'some_value'}}"
);
}
const chatHistory = getChatHistory(config.configurable.sessionId as string);
let messages = [...(await chatHistory.getMessages()), ...state.messages];
if (state.messages.length === 1) {
// First message, ensure it's in the chat history
await chatHistory.addMessage(state.messages[0]);
}
const aiMessage = await model.invoke(messages);
// Update the chat history
await chatHistory.addMessage(aiMessage);
return { messages: [aiMessage] };
};
// Define a new graph
const workflow = new StateGraph(MessagesAnnotation)
.addNode("model", callModel)
.addEdge(START, "model")
.addEdge("model", END);
const app = workflow.compile();
// Create a unique session ID to identify the conversation
const sessionId = uuidv4();
const config = { configurable: { sessionId }, streamMode: "values" as const };
const inputMessage = new HumanMessage("hi! I'm bob");
for await (const event of await app.stream(
{ messages: [inputMessage] },
config
)) {
const lastMessage = event.messages[event.messages.length - 1];
console.log(lastMessage.content);
}
// Here, let's confirm that the AI remembers our name!
const followUpMessage = new HumanMessage("what was my name?");
for await (const event of await app.stream(
{ messages: [followUpMessage] },
config
)) {
const lastMessage = event.messages[event.messages.length - 1];
console.log(lastMessage.content);
}
hi! I'm bob
Hello Bob! It's nice to meet you. How can I assist you today?
what was my name?
You said your name is Bob.
Usage with RunnableWithMessageHistory
This how-to guide used the getMessages
and addMessage
methods of BaseChatMessageHistory
directly.
Alternatively, you can use RunnableWithMessageHistory, as LCEL can be used inside any LangGraph node.
To do so, replace the following code:
const callModel = async (
state: typeof MessagesAnnotation.State,
config: RunnableConfig
): Promise<Partial<typeof MessagesAnnotation.State>> => {
if (!config.configurable?.sessionId) {
throw new Error(
"Make sure that the config includes the following information: {'configurable': {'sessionId': 'some_value'}}"
);
}
const chatHistory = getChatHistory(config.configurable.sessionId as string);
let messages = [...(await chatHistory.getMessages()), ...state.messages];
if (state.messages.length === 1) {
// First message, ensure it's in the chat history
await chatHistory.addMessage(state.messages[0]);
}
const aiMessage = await model.invoke(messages);
// Update the chat history
await chatHistory.addMessage(aiMessage);
return { messages: [aiMessage] };
};
with a corresponding instance of RunnableWithMessageHistory
defined in your current application:
const runnable = new RunnableWithMessageHistory({
// ... configuration from existing code
});
const callModel = async (
state: typeof MessagesAnnotation.State,
config: RunnableConfig
): Promise<Partial<typeof MessagesAnnotation.State>> => {
// RunnableWithMessageHistory takes care of reading the message history
// and updating it with the new human message and AI response.
const aiMessage = await runnable.invoke(state.messages, config);
return {
messages: [aiMessage],
};
};
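For reference, one possible configuration of such an instance, assuming the `model` and `getChatHistory` defined earlier, might look like the sketch below. The exact fields depend on your existing chain (for example, `inputMessagesKey` and `historyMessagesKey` if you wrap a prompt-based chain), so treat this as an illustration rather than a drop-in replacement:

```typescript
import { RunnableWithMessageHistory } from "@langchain/core/runnables";

// Sketch only: wraps the bare chat model with the in-memory history
// store defined earlier in this guide.
const runnable = new RunnableWithMessageHistory({
  runnable: model,
  getMessageHistory: (sessionId: string) => getChatHistory(sessionId),
});
```

When invoked inside the node, RunnableWithMessageHistory reads the session ID from `config.configurable.sessionId`, so the same `config` used throughout this guide continues to work.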