Artificial Intelligence Demonstrates Brain-like Memory Formation Process
Researchers at the Center for Cognition and Sociality of the Institute for Basic Science in South Korea have discovered striking similarities between memory processing in AI models and the human hippocampus. The finding offers fresh insight into how memory consolidation, the conversion of short-term memories into long-term ones, might arise in AI systems.
In the race to develop artificial general intelligence (AGI), understanding and replicating human-like intelligence has become a crucial research focus. At the heart of these technological advancements lies the Transformer model, whose fundamental principles are now being deeply explored.
The key to powerful AI systems lies in understanding how they learn and retain information. The research team applied principles of human brain learning, particularly the way memories are consolidated through NMDA receptors in the hippocampus, to AI models.
NMDA receptors act like intelligent gates in the brain, facilitating learning and memory formation. When the neurotransmitter glutamate binds to the receptor, nerve cells become excitable, but magnesium ions sit in the receptor's channel like tiny gatekeepers, blocking the passage.
Only when these ionic gatekeepers step aside, as the cell becomes sufficiently depolarized, can ions such as calcium flow into the cell and trigger the synaptic changes that create and preserve memories. The gatekeeper role of the magnesium ion is highly specific throughout this process.
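The voltage dependence of this magnesium block is well characterized in electrophysiology. As a rough illustration (not part of the study itself), the widely used Jahr-Stevens model expresses the fraction of unblocked NMDA receptor channels as a function of membrane voltage and magnesium concentration; the constants below are the standard published values, and the function name is ours:

```python
import math

def mg_block_open_fraction(voltage_mv: float, mg_mM: float = 1.0) -> float:
    """Fraction of NMDA receptor channels NOT blocked by magnesium,
    per the Jahr-Stevens (1990) voltage-dependence model.

    voltage_mv: membrane potential in millivolts.
    mg_mM: extracellular magnesium concentration in millimolar.
    """
    return 1.0 / (1.0 + (mg_mM / 3.57) * math.exp(-0.062 * voltage_mv))

# At resting potential (~ -70 mV) the channel is mostly blocked;
# depolarization toward 0 mV relieves the block and lets ions through.
resting = mg_block_open_fraction(-70.0)
depolarized = mg_block_open_fraction(0.0)
```

The gate therefore opens only when both conditions are met: glutamate is bound and the cell is depolarized enough to expel the magnesium ion.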
The research team discovered that Transformer models appear to use a gating process analogous to the NMDA receptors in the brain. This finding prompted the team to further investigate whether Transformer's memory consolidation could be controlled through mechanisms resembling the NMDA receptor gating process.
In animal brains, low magnesium levels impair memory function. The researchers found that long-term memory in Transformers can likewise be improved by mimicking the NMDA receptor: just as variations in magnesium concentration affect memory strength in the brain, adjusting Transformer parameters to reflect the receptor's gating behavior enhances the model's memory capacity.
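To make the analogy concrete, here is a minimal sketch, not the authors' exact formulation: a gated nonlinearity of the form x * sigmoid(alpha * x), where the steepness parameter alpha (our hypothetical "magnesium-like" knob) controls how strongly negative inputs are blocked. With alpha = 1 this is the SiLU/Swish activation commonly used in Transformer feed-forward layers, whose shape resembles the NMDA receptor's current-voltage curve:

```python
import math

def sigmoid(x: float) -> float:
    return 1.0 / (1.0 + math.exp(-x))

def nmda_like_activation(x: float, alpha: float = 1.0) -> float:
    """Gated nonlinearity x * sigmoid(alpha * x).

    alpha is a hypothetical 'magnesium-like' parameter: larger alpha
    makes the gate steeper (negative inputs are blocked more strongly),
    smaller alpha makes it more permissive. alpha = 1 recovers SiLU.
    """
    return x * sigmoid(alpha * x)

# Sweeping alpha changes how much of a negative input leaks through,
# analogous to varying magnesium concentration in the receptor analogy.
for alpha in (0.5, 1.0, 4.0):
    print(f"alpha={alpha}: f(-2.0) = {nmda_like_activation(-2.0, alpha):.4f}")
```

In this picture, tuning alpha plays the role that magnesium concentration plays in the brain: it sets how selectively the gate admits signals, which the study links to how well the model retains information.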
This breakthrough discovery not only enables deeper research into how the brain works but also facilitates the development of more advanced AI systems based on these insights.
The study demonstrates that the learning mechanisms of AI models can be explained with established neuroscience. The result is a crucial step in the convergence of AI and neuroscience and marks significant progress toward simulating human-like memory consolidation. Integrating human cognitive mechanisms into AI design holds promise both for building cost-effective, high-performance AI systems and for using AI models to shed light on how the brain works.