Google proposes Titans: breaking through compute limits to expand the context window
On February 25, Google Research released a new study on Titans. By introducing a novel neural long-term memory module, a collaborative three-part architecture, and hardware-aware design, Titans extends a large model's context window to 2 million tokens while increasing compute by only 1.8x. Titans not only addresses the Transformer's compute bottleneck in long-context processing, but also mimics the hierarchical structure of the human memory system through its biomimetic design, achieving accurate inference over ultra-long contexts of 2 million tokens for the first time.
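To give a flavor of what a "neural long-term memory module" means here: the Titans paper describes a memory that is updated at test time by how *surprising* each input is, with momentum over past surprise and a decay term that acts as forgetting. The sketch below is a simplified linear version under those assumptions; the function name, hyperparameter values, and the use of a plain matrix memory are illustrative choices, not the paper's actual implementation.

```python
import numpy as np

def memory_step(M, S, k, v, alpha=0.01, eta=0.9, theta=0.1):
    """One simplified update of a linear associative memory M.

    "Surprise" is taken as the gradient of the reconstruction loss
    0.5 * ||M @ k - v||^2; S accumulates momentum over past surprise,
    and alpha decays (forgets) old memory. A hypothetical sketch of
    the surprise-driven update idea, not the paper's architecture.
    """
    grad = np.outer(M @ k - v, k)   # gradient of the associative loss w.r.t. M
    S = eta * S - theta * grad      # momentum over the surprise signal
    M = (1 - alpha) * M + S        # decay term acts as a forgetting gate
    return M, S

# Toy usage: memorize a single key/value association.
rng = np.random.default_rng(0)
d = 8
M = np.zeros((d, d))
S = np.zeros((d, d))
k = rng.normal(size=d)
k /= np.linalg.norm(k)
v = rng.normal(size=d)
for _ in range(200):
    M, S = memory_step(M, S, k, v)
err = np.linalg.norm(M @ k - v)  # small after repeated exposure
```

After repeated exposure the memory reproduces `v` from `k` almost exactly (up to a small bias introduced by the forgetting term), which is the basic behavior a learned long-term memory needs before it can be combined with attention over a short window.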