Table of Contents
1. From the perspective of challenges, why can TransformerFAM help large models “remember more”?
2. Working memory takes large models a step further toward AGI

Google takes action to rectify the 'amnesia' of large models! The feedback attention mechanism helps you 'update' the context, and the era of unlimited memory for large models is coming.

Apr 17, 2024, 03:40 PM
Tags: Google, Model, Attention

Editor | Yi Feng

Produced by | 51CTO Technology Stack (WeChat ID: blog51cto)

Google finally took action! We will no longer suffer from the "amnesia" of large models.

TransformerFAM was born, promising to give large models unlimited memory!

Without further ado, let’s take a look at the “efficacy” of TransformerFAM:

[Figure: performance of FAM versus BSWA configurations on long-context tasks]

The large model's performance on long-context tasks improves significantly!

In the above figure, tasks such as Isabelle and NarrativeQA require the model to understand and process large amounts of contextual information and to give accurate answers or summaries to specific questions. On all tasks, the model configured with FAM outperforms every other BSWA configuration; moreover, beyond a certain point, increasing the number of BSWA memory segments no longer improves its memory capability.

It seems that, when it comes to long texts and long conversations, the "never forgetting" FAM really does have something to it.

Google researchers introduced a novel Transformer architecture, FAM (Feedback Attention Memory). It uses a feedback loop to let the network attend to its own latent representations, fostering the emergence of working memory inside the Transformer and enabling it to handle infinitely long sequences.

Simply put, this strategy is a bit like how we manually combat the "amnesia" of large models: re-entering the prompt before each conversation turn. FAM's approach is just more sophisticated: when the model processes a new block of data, it takes the previously processed information (that is, the FAM) as a dynamically updated context and integrates it back into the current processing step.

In this way, the "forgetting" problem is handled well. Even better, although a feedback mechanism is introduced to maintain long-term working memory, FAM is designed to remain compatible with pre-trained models without requiring additional weights. So, in theory, the large model's powerful memory neither dulls it nor consumes more computing resources.

So, how was such a remarkable TransformerFAM discovered? What technologies lie behind it?

1. From the perspective of challenges, why can TransformerFAM help large models “remember more”?

The concept of Sliding Window Attention (SWA) is crucial to the design of TransformerFAM.

In the traditional Transformer model, the complexity of self-attention (Self-Attention) increases quadratically as the length of the sequence increases, which limits the model's ability to handle long sequences.
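To make the scaling concrete, here is a minimal back-of-the-envelope comparison (illustrative only; it counts pairwise attention scores and ignores constants and hardware effects):

```python
# Rough cost comparison: full self-attention scales as O(n^2),
# while a sliding window of size w scales as O(n * w).

def full_attention_cost(n: int) -> int:
    """Number of pairwise score computations for full self-attention."""
    return n * n

def windowed_attention_cost(n: int, w: int) -> int:
    """Each token attends to at most w tokens in its window."""
    return n * w

n, w = 8192, 512
print(full_attention_cost(n) // windowed_attention_cost(n, w))  # 16x fewer scores
```

At a sequence length of 8192 with a 512-token window, the windowed variant computes 16x fewer attention scores, and the gap widens linearly as the sequence grows.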

"In the movie Memento (2000), the main character suffers from anterograde amnesia, which means he cannot remember what happened in the past 10 minutes, but his long-term memory is intact , he had to tattoo important information on his body to remember them, similar to the current state of large language models (LLMs)," the paper reads.

[Screenshot from the movie Memento (2000); image from the Internet]

Sliding Window Attention (SWA) is an improved attention mechanism for processing long-sequence data, inspired by the sliding-window technique in computer science. In natural language processing (NLP) tasks, SWA lets the model attend, at each step, to only a fixed-size window of the input sequence rather than the entire sequence. Its advantage is that it significantly reduces the amount of computation.
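The windowing idea can be sketched as an attention mask. This is a minimal illustration, assuming a causal window of size w (token i may attend to tokens j with i - w < j <= i); real implementations apply such a mask inside the attention computation:

```python
# Build a causal sliding-window attention mask:
# True where attention is allowed, False where it is masked out.

def sliding_window_mask(n: int, w: int) -> list[list[bool]]:
    """n = sequence length, w = window size."""
    return [[(i - w < j <= i) for j in range(n)] for i in range(n)]

# Visualize a 5-token sequence with a window of 2.
for row in sliding_window_mask(5, 2):
    print("".join("x" if ok else "." for ok in row))
```

Each row shows which earlier tokens one position can see: a diagonal band of width w, rather than the full lower triangle of ordinary causal attention.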


However, SWA has a limitation: its attention span is bounded by the window size, so the model cannot take important information outside the window into account.

TransformerFAM adds feedback activations that feed contextual representations back into each block of sliding-window attention, thereby achieving integrated attention, block-level updates, information compression, and global context storage.

In TransformerFAM, the improvement comes from a feedback loop. Specifically, when processing the current sequence block, the model not only attends to elements within the current window but also reintroduces previously processed contextual information (the previous "feedback activations") as additional input to the attention mechanism. In this way, even as the attention window slides over the sequence, the model maintains memory and understanding of earlier information.
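The block-wise feedback loop described above can be sketched as follows. This is a toy illustration, not the paper's implementation: `process_block` and `compress` are hypothetical stand-ins for windowed attention and for compressing activations into a fixed-size memory.

```python
# Toy sketch of a FAM-style feedback loop: each block is processed together
# with a fixed-size "feedback memory" that is updated after every block,
# so information can cross window boundaries.

def compress(block, memory, mem_size=4):
    """Fold the current block into a fixed-size memory (here: simply keep
    the last mem_size values, as a stand-in for learned compression)."""
    return (memory + block)[-mem_size:]

def process_block(block, memory):
    """Stand-in for windowed attention over [memory + block]: each output
    is the mean of the visible context."""
    context = memory + block
    return [sum(context) / len(context)] * len(block)

def run(sequence, block_size=3):
    memory = []                                  # FAM: carried across blocks
    outputs = []
    for start in range(0, len(sequence), block_size):
        block = sequence[start:start + block_size]
        outputs += process_block(block, memory)  # attend to window + memory
        memory = compress(block, memory)         # update working memory
    return outputs

print(run(list(range(10))))
```

The key point is that `memory` stays a constant size no matter how long the sequence gets, which is why the cost remains linear while context still flows forward indefinitely.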

After these improvements, TransformerFAM gives LLMs the potential to handle sequences of unlimited length!

2. Working memory takes large models a step further toward AGI

TransformerFAM has shown a promising outlook in research; it will undoubtedly improve AI's ability to understand and generate long texts, boosting performance on tasks such as document summarization, story writing, and Q&A.


At the same time, an AI with unlimited memory sounds all the more appealing, whether as an intelligent assistant or an emotional companion.

Interestingly, the design of TransformerFAM is inspired by memory mechanisms in biology, which aligns with the natural-intelligence simulation that AGI pursues. The paper is an attempt to integrate attention-based working memory, a concept from neuroscience, into the field of deep learning.

TransformerFAM introduces working memory into large models through a feedback loop, enabling the model not only to remember short-term information but also to retain important information across long sequences.

Through bold imagination, the researchers build a hypothetical bridge between the real world and abstract concepts. As innovations like TransformerFAM continue to emerge, technical bottlenecks will be broken through again and again, and a more intelligent, interconnected future is slowly unfolding before us.

For more on AIGC, please visit:

51CTO AI.x Community

https://www.51cto.com/aigc/
