# Motörhead Memory

Motörhead is a memory server implemented in Rust. It automatically handles incremental summarization in the background and allows for stateless applications.
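The stateless pattern works because the server, not your application process, owns the conversation history; the memory class only needs load/save hooks around each chain call. The sketch below is illustrative only (the interface and class names are hypothetical, not Motörhead's actual API) and uses an in-process stand-in where Motörhead would persist per-`sessionId` state server-side:

```typescript
// Illustrative sketch only: a minimal stand-in for the load/save contract that
// chat memory backends (like Motörhead) expose to a chain. Names are hypothetical.
interface ChatMemory {
  loadMemoryVariables(): Promise<Record<string, string>>;
  saveContext(input: { input: string }, output: { text: string }): Promise<void>;
}

// In-process stand-in; Motörhead instead stores this server-side per sessionId
// and summarizes older turns incrementally in the background.
class InMemoryStub implements ChatMemory {
  private history: string[] = [];

  async loadMemoryVariables(): Promise<Record<string, string>> {
    // A chain reads this before each call to build the prompt.
    return { chat_history: this.history.join("\n") };
  }

  async saveContext(
    input: { input: string },
    output: { text: string }
  ): Promise<void> {
    // A chain writes the latest turn here after each call.
    this.history.push(`Human: ${input.input}`, `AI: ${output.text}`);
  }
}
```

Because the real backing store lives behind a server, any process that reconnects with the same `sessionId` picks up the accumulated history and summary.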
## Installation

See the instructions in the Motörhead repository for how to run the server locally, or visit https://getmetal.io to get an API key for the managed version.
## Usage

Tip: See this section for general instructions on installing integration packages.
npm:

```bash
npm install @langchain/openai @langchain/core
```

Yarn:

```bash
yarn add @langchain/openai @langchain/core
```

pnpm:

```bash
pnpm add @langchain/openai @langchain/core
```
```typescript
import { ChatOpenAI } from "@langchain/openai";
import { ConversationChain } from "langchain/chains";
import { MotorheadMemory } from "@langchain/community/memory/motorhead_memory";

// Managed Example (visit https://getmetal.io to get your keys)
// const managedMemory = new MotorheadMemory({
//   memoryKey: "chat_history",
//   sessionId: "test",
//   apiKey: "MY_API_KEY",
//   clientId: "MY_CLIENT_ID",
// });

// Self Hosted Example
const memory = new MotorheadMemory({
  memoryKey: "chat_history",
  sessionId: "test",
  url: "localhost:8080", // Required for self hosted
});

const model = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});

const chain = new ConversationChain({ llm: model, memory });

const res1 = await chain.invoke({ input: "Hi! I'm Jim." });
console.log({ res1 });
/*
{
  res1: {
    text: "Hello Jim! It's nice to meet you. My name is AI. How may I assist you today?"
  }
}
*/

const res2 = await chain.invoke({ input: "What did I just say my name was?" });
console.log({ res2 });
/*
{
  res2: {
    text: "You said your name was Jim."
  }
}
*/
```
## API Reference

- ChatOpenAI from `@langchain/openai`
- ConversationChain from `langchain/chains`
- MotorheadMemory from `@langchain/community/memory/motorhead_memory`