Why 1M token context windows won't solve agent amnesia

📄 Summary
Google has launched Gemini with a 1M-token context window, and Anthropic is pushing 200K tokens. The AI community celebrated, believing agents could finally remember everything. This is a misconception: a context window is not memory, it is short-term attention. Every time a new session begins, the window starts empty and the agent is back to total amnesia. A 1M-token window can hold roughly 750,000 words within a single conversation, but it does nothing to persist a user's preferences or information across sessions.
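The distinction can be made concrete with a minimal sketch (all class and method names here are hypothetical, invented for illustration): the context window is per-session state that vanishes when the session ends, while real memory has to live in an external store that outlives the session.

```python
# Illustrative sketch, not any vendor's API: a context window is
# per-session state, while memory is an external store that survives
# session boundaries.
from collections import deque


class Session:
    """Holds a bounded context window, discarded when the session ends."""

    def __init__(self, max_tokens: int):
        # Simplification: one message counts as one "token" here.
        self.window = deque(maxlen=max_tokens)

    def add(self, message: str):
        self.window.append(message)  # oldest entries fall out when full


class PersistentMemory:
    """External key-value store that outlives any single session."""

    def __init__(self):
        self.facts = {}

    def remember(self, key: str, value: str):
        self.facts[key] = value

    def recall(self, key: str):
        return self.facts.get(key)


memory = PersistentMemory()

# Session 1: the user states a preference; the agent writes it to memory.
s1 = Session(max_tokens=1_000_000)
s1.add("user: reply in French, please")
memory.remember("language_preference", "French")
del s1  # session ends: the entire 1M-token window is gone

# Session 2: the window starts empty, but the external store persists.
s2 = Session(max_tokens=1_000_000)
assert len(s2.window) == 0  # total "amnesia" at the window level
assert memory.recall("language_preference") == "French"  # memory survives
```

However large `max_tokens` grows, the second session still starts with an empty window; only the write to `PersistentMemory` carries the preference across the boundary, which is the point the article is making.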
Powered by Cloudflare Workers + Payload CMS + Claude 3.5
Data sources: OpenAI, Google AI, DeepMind, AWS ML Blog, HuggingFace, and others