
PlanetScale Chat Memory

Because PlanetScale works via a REST API, you can use it with Vercel Edge, Cloudflare Workers, and other serverless environments.

For longer-term persistence across chat sessions, you can swap out the default in-memory chatHistory that backs chat memory classes like BufferMemory for a PlanetScale Database instance.

Setup

You will need to install @planetscale/database in your project:

Tip

See this section for general instructions on installing integration packages.

npm install @langchain/openai @planetscale/database @langchain/community @langchain/core

You will also need a PlanetScale account and a database to connect to. See the PlanetScale documentation for instructions on how to create an HTTP client.
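
The examples below use an "ADD_YOURS_HERE" placeholder for the connection URL. A common pattern (an assumption here, not something the integration requires) is to keep that URL in an environment variable and read it at startup:

// Hypothetical setup: read the PlanetScale connection URL from an
// environment variable so credentials stay out of source control.
const databaseUrl = process.env.PLANETSCALE_DATABASE_URL;
if (!databaseUrl) {
  throw new Error("Missing PLANETSCALE_DATABASE_URL environment variable.");
}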

Usage

Each chat history session stored in your PlanetScale database must have a unique id. The config parameter is passed directly into the new Client() constructor from @planetscale/database, and accepts all the same arguments.

import { BufferMemory } from "langchain/memory";
import { PlanetScaleChatMessageHistory } from "@langchain/community/stores/message/planetscale";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";

const memory = new BufferMemory({
  chatHistory: new PlanetScaleChatMessageHistory({
    tableName: "stored_message",
    sessionId: "lc-example",
    config: {
      url: "ADD_YOURS_HERE", // Override with your own database instance's URL
    },
  }),
});

const model = new ChatOpenAI();
const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
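
Because the config object is forwarded to new Client(), any of the driver's connection options work here. A minimal sketch, assuming you store separate credentials in the hypothetical PLANETSCALE_HOST, PLANETSCALE_USERNAME, and PLANETSCALE_PASSWORD environment variables instead of a single URL:

const memoryWithCredentials = new BufferMemory({
  chatHistory: new PlanetScaleChatMessageHistory({
    tableName: "stored_message",
    sessionId: "lc-example",
    config: {
      // These are the same fields new Client() accepts; the env variable
      // names are illustrative, not required by the integration.
      host: process.env.PLANETSCALE_HOST,
      username: process.env.PLANETSCALE_USERNAME,
      password: process.env.PLANETSCALE_PASSWORD,
    },
  }),
});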

API Reference

Advanced Usage

You can also directly pass in a previously created @planetscale/database client instance:

import { BufferMemory } from "langchain/memory";
import { PlanetScaleChatMessageHistory } from "@langchain/community/stores/message/planetscale";
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { Client } from "@planetscale/database";

// Create your own Planetscale database client
const client = new Client({
  url: "ADD_YOURS_HERE", // Override with your own database instance's URL
});

const memory = new BufferMemory({
  chatHistory: new PlanetScaleChatMessageHistory({
    tableName: "stored_message",
    sessionId: "lc-example",
    client, // You can reuse your existing database client
  }),
});

const model = new ChatOpenAI();
const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
  {
    res1: {
      text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
    }
  }
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });

/*
  {
    res2: {
      text: "You said your name was Jim."
    }
  }
*/
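
Since you control the client, you can also reuse it for queries outside of the chat history. A hedged sketch, assuming the driver's execute() method and the stored_message table managed by PlanetScaleChatMessageHistory above:

// Sketch only: count how many messages have been stored so far.
const result = await client.execute(
  "SELECT COUNT(*) AS total FROM stored_message"
);
console.log(result.rows);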

API Reference

