
Hello, friends! I'm Xiaozhi, your AI guide, and welcome to our daily AI study session. Today we dive into the wonderful world of AI together and work through "Introductory LLM Course for Developers - Conversation Memory (English Prompts)", covering every point discussed in this article. As always: no need to venture into the unknown, just awaken your own potential! Follow along with Xiaozhi and we will learn something solid, put it to use, and discover more of what we are capable of. Without further ado, let's begin this AI learning journey.

Introductory LLM Course for Developers - Conversation Memory (English Prompts)

English prompts

1. Conversation buffer memory

from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

llm = ChatOpenAI(temperature=0.0)
memory = ConversationBufferMemory()
conversation = ConversationChain(llm=llm, memory=memory, verbose=True)

print("Round 1:")
conversation.predict(input="Hi, my name is Andrew")
print("Round 2:")
conversation.predict(input="What is 1+1?")
print("Round 3:")
conversation.predict(input="What is my name?")

Round 1:
> Entering new chain…
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is
talkative and provides lots of specific details from its context. If the AI does
not know the answer to a question, it truthfully says it does not know.
Current conversation:
Human: Hi, my name is Andrew
AI:
> Finished chain.
Round 2:
> Entering new chain…
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is
talkative and provides lots of specific details from its context. If the AI does
not know the answer to a question, it truthfully says it does not know.
Current conversation:
Human: Hi, my name is Andrew
AI: Hello Andrew! It’s nice to meet you. How can I assist you today?
Human: What is 1+1?
AI:
> Finished chain.
Round 3:
> Entering new chain…
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is
talkative and provides lots of specific details from its context. If the AI does
not know the answer to a question, it truthfully says it does not know.
Current conversation:
Human: Hi, my name is Andrew
AI: Hello Andrew! It’s nice to meet you. How can I assist you today?
Human: What is 1+1?
AI: 1+1 is equal to 2.
Human: What is my name?
AI:
> Finished chain.

'Your name is Andrew.'

print("Inspect the memory buffer, method 1:")
print(memory.buffer)
print("Inspect the memory buffer, method 2:")
print(memory.load_memory_variables({}))

print("Add a specified exchange to the buffer and inspect it")
memory = ConversationBufferMemory()  # create a new, empty conversation buffer memory
memory.save_context({"input": "Hi"}, {"output": "What's up"})  # add the specified input/output pair to the buffer
print(memory.buffer)  # inspect the buffer
print(memory.load_memory_variables({}))  # load the memory variables again

print("Add another exchange to the buffer and inspect it")
memory.save_context({"input": "Not much, just hanging"}, {"output": "Cool"})
print(memory.buffer)  # inspect the buffer
print(memory.load_memory_variables({}))  # load the memory variables again

Inspect the memory buffer, method 1:
Human: Hi, my name is Andrew
AI: Hello Andrew! It's nice to meet you. How can I assist you today?
Human: What is 1+1?
AI: 1+1 is equal to 2.
Human: What is my name?
AI: Your name is Andrew.
Inspect the memory buffer, method 2:
{'history': "Human: Hi, my name is Andrew\nAI: Hello Andrew! It's nice to meet you. How can I assist you today?\nHuman: What is 1+1?\nAI: 1+1 is equal to 2.\nHuman: What is my name?\nAI: Your name is Andrew."}
Add a specified exchange to the buffer and inspect it
Human: Hi
AI: What's up
{'history': "Human: Hi\nAI: What's up"}
Add another exchange to the buffer and inspect it
Human: Hi
AI: What's up
Human: Not much, just hanging
AI: Cool
{'history': "Human: Hi\nAI: What's up\nHuman: Not much, just hanging\nAI: Cool"}
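Besides the plain-text buffer shown above, the stored history can also be retrieved as structured message objects. The following is a minimal sketch under the same langchain version used in this tutorial; the exact output representation may vary between versions:

from langchain.memory import ConversationBufferMemory

# Return the history as HumanMessage/AIMessage objects instead of one formatted string.
memory = ConversationBufferMemory(return_messages=True)
memory.save_context({"input": "Hi"}, {"output": "What's up"})

# 'history' is now a list of message objects rather than a string.
print(memory.load_memory_variables({}))
# The underlying chat history can also be read directly:
print(memory.chat_memory.messages)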

2. Conversation buffer window memory

from langchain.memory import ConversationBufferWindowMemory
# ChatOpenAI and ConversationChain were imported in section 1

# k is the window size; k=1 keeps only the most recent exchange
memory = ConversationBufferWindowMemory(k=1)
# add two exchanges to the memory
memory.save_context({"input": "Hi"}, {"output": "What's up"})
memory.save_context({"input": "Not much, just hanging"}, {"output": "Cool"})
# inspect what the memory currently holds
memory.load_memory_variables({})

llm = ChatOpenAI(temperature=0.0)
memory = ConversationBufferWindowMemory(k=1)
conversation = ConversationChain(llm=llm, memory=memory, verbose=False)
print("Round 1:")
print(conversation.predict(input="Hi, my name is Andrew"))
print("Round 2:")
print(conversation.predict(input="What is 1+1?"))
print("Round 3:")
print(conversation.predict(input="What is my name?"))

Round 1:
Hello Andrew! It's nice to meet you. How can I assist you today?
Round 2:
1+1 is equal to 2.
Round 3:
I'm sorry, but I don't have access to personal information.
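For reference, the load_memory_variables call in the snippet above returns only the most recent exchange, because k=1 drops everything older. A minimal check under the same setup (the exact string formatting may differ slightly across langchain versions):

from langchain.memory import ConversationBufferWindowMemory

memory = ConversationBufferWindowMemory(k=1)
memory.save_context({"input": "Hi"}, {"output": "What's up"})
memory.save_context({"input": "Not much, just hanging"}, {"output": "Cool"})
# Only the last exchange survives the k=1 window:
print(memory.load_memory_variables({}))
# Expected (approximately): {'history': 'Human: Not much, just hanging\nAI: Cool'}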

3. Conversation token buffer memory

from langchain.llms import OpenAI
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationTokenBufferMemory

llm = ChatOpenAI(temperature=0.0)  # the memory uses this LLM to count tokens
memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=30)
memory.save_context({"input": "AI is what?!"}, {"output": "Amazing!"})
memory.save_context({"input": "Backpropagation is what?"}, {"output": "Beautiful!"})
memory.save_context({"input": "Chatbots are what?"}, {"output": "Charming!"})
print(memory.load_memory_variables({}))

{'history': 'AI: Beautiful!\nHuman: Chatbots are what?\nAI: Charming!'}
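The max_token_limit=30 setting trims the history from the oldest side until what remains fits within roughly 30 tokens, which is why the first exchange and the second human turn are dropped above. As a sketch, reusing the llm from the snippet above, a larger limit should keep more of the history (the exact cut-off depends on the tokenizer the LLM reports):

# With a larger limit, all three exchanges should typically fit.
memory = ConversationTokenBufferMemory(llm=llm, max_token_limit=100)
memory.save_context({"input": "AI is what?!"}, {"output": "Amazing!"})
memory.save_context({"input": "Backpropagation is what?"}, {"output": "Beautiful!"})
memory.save_context({"input": "Chatbots are what?"}, {"output": "Charming!"})
print(memory.load_memory_variables({}))  # expected to contain all three exchanges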

4. Conversation summary buffer memory

from langchain.chains import ConversationChain
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationSummaryBufferMemory

# create a long string
schedule = "There is a meeting at 8am with your product team. \
You will need your powerpoint presentation prepared. \
9am-12pm have time to work on your LangChain \
project which will go quickly because Langchain is such a powerful tool. \
At Noon, lunch at the Italian restaurant with a customer who is driving \
from over an hour away to meet you to understand the latest in AI. \
Be sure to bring your laptop to show the latest LLM demo."

# use the conversation summary buffer memory
llm = ChatOpenAI(temperature=0.0)
memory = ConversationSummaryBufferMemory(llm=llm, max_token_limit=100)
memory.save_context({"input": "Hello"}, {"output": "What's up"})
memory.save_context({"input": "Not much, just hanging"}, {"output": "Cool"})
memory.save_context({"input": "What is on the schedule today?"}, {"output": f"{schedule}"})

print("Inspect the conversation summary buffer memory")
print(memory.load_memory_variables({})['history'])

conversation = ConversationChain(llm=llm, memory=memory, verbose=True)
print("A conversation chain backed by the summary buffer memory")
conversation.predict(input="What would be a good demo to show?")

print("Inspect the conversation summary buffer memory again")
print(memory.load_memory_variables({})['history'])

Inspect the conversation summary buffer memory
System: The human and AI exchange greetings. The human asks about the schedule
for the day. The AI provides a detailed schedule, including a meeting with the
product team, work on the LangChain project, and a lunch meeting with a customer
interested in AI. The AI emphasizes the importance of bringing a laptop to
showcase the latest LLM demo during the lunch meeting.
A conversation chain backed by the summary buffer memory
> Entering new chain…
Prompt after formatting:
The following is a friendly conversation between a human and an AI. The AI is
talkative and provides lots of specific details from its context. If the AI does
not know the answer to a question, it truthfully says it does not know.
Current conversation:
System: The human and AI exchange greetings. The human asks about the schedule
for the day. The AI provides a detailed schedule, including a meeting with the
product team, work on the LangChain project, and a lunch meeting with a customer
interested in AI. The AI emphasizes the importance of bringing a laptop to
showcase the latest LLM demo during the lunch meeting.
Human: What would be a good demo to show?
AI:
> Finished chain.
Inspect the conversation summary buffer memory again
System: The human and AI exchange greetings and discuss the schedule for the day.
The AI provides a detailed schedule, including a meeting with the product team,
work on the LangChain project, and a lunch meeting with a customer interested in
AI. The AI emphasizes the importance of bringing a laptop to showcase the latest
LLM demo during the lunch meeting. The human asks what would be a good demo to
show, and the AI suggests showcasing the latest LLM (Language Model) demo. The
LLM is a cutting-edge AI model that can generate human-like text based on a given
prompt. It has been trained on a vast amount of data and can generate coherent
and contextually relevant responses. By showcasing the LLM demo, the AI can
demonstrate the capabilities of their AI technology and how it can be applied to
various industries and use cases.
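Under the hood, once the raw history exceeds max_token_limit, the older turns are condensed into the running summary (the System: line above) while the most recent turns stay verbatim. A minimal way to inspect this split, assuming the classic langchain ConversationSummaryBufferMemory, which exposes a moving_summary_buffer attribute (treat the attribute name as an assumption if your version differs):

# The condensed summary of the pruned (older) turns:
print(memory.moving_summary_buffer)
# The recent turns that are still stored verbatim as messages:
print(memory.chat_memory.messages)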


Hey friends, today's AI exploration has come to an end, and everything in "Introductory LLM Course for Developers - Conversation Memory (English Prompts)" has now been shared with you. Thank you for joining me; I hope this journey helped you understand and enjoy AI a little more. Remember, asking precise questions is the key to unlocking AI's potential! If you would like to keep learning about AI, follow our official site "AI智研社" for plenty more to take away.


Copyright: please credit the source when reposting: https://www.ai-blog.cn/2673.html
