Friendli
Friendli enhances AI application performance and optimizes cost savings with scalable, efficient deployment options tailored for high-demand AI workloads.
This tutorial guides you through integrating ChatFriendli for chat applications using LangChain. ChatFriendli offers a flexible approach to generating conversational AI responses, supporting both synchronous and asynchronous calls.
Setup
Ensure that @langchain/community is installed.
Tip
- npm: npm install @langchain/community @langchain/core
- Yarn: yarn add @langchain/community @langchain/core
- pnpm: pnpm add @langchain/community @langchain/core
Sign in to Friendli Suite to create a personal access token, and set it as the FRIENDLI_TOKEN environment variable. You can also set your team ID as the FRIENDLI_TEAM environment variable.
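For example, in a POSIX shell the two variables can be exported like this (the values below are placeholders; substitute your own token and team ID):

```shell
# Personal access token from Friendli Suite
export FRIENDLI_TOKEN="your-personal-access-token"
# Optional: team ID, only needed when working under a team
export FRIENDLI_TEAM="your-team-id"
```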
You can initialize a Friendli chat model by selecting the model to use. The default model is meta-llama-3-8b-instruct. You can check the available models at docs.friendli.ai.
Usage
import { ChatFriendli } from "@langchain/community/chat_models/friendli";

const model = new ChatFriendli({
  model: "meta-llama-3-8b-instruct", // Default value
  friendliToken: process.env.FRIENDLI_TOKEN,
  friendliTeam: process.env.FRIENDLI_TEAM,
  maxTokens: 800,
  temperature: 0.9,
  topP: 0.9,
  frequencyPenalty: 0,
  stop: [],
});
const response = await model.invoke(
  "Draft a cover letter for a role in software engineering."
);

console.log(response.content);
/*
Dear [Hiring Manager],
I am excited to apply for the role of Software Engineer at [Company Name]. With my passion for innovation, creativity, and problem-solving, I am confident that I would be a valuable asset to your team.
As a highly motivated and detail-oriented individual, ...
*/
const stream = await model.stream(
  "Draft a cover letter for a role in software engineering."
);

for await (const chunk of stream) {
  console.log(chunk.content);
}
/*
D
ear
[
H
iring
...
[
Your
Name
]
*/
API Reference
- ChatFriendli from @langchain/community/chat_models/friendli