Neural networks learn more when they are given time to sleep

Artificial neural networks learn more when they spend time “sleeping”, researchers claim, which may allow AI systems to learn continuously, like humans and animals

Neural networks used for cutting-edge artificial intelligence computing systems may benefit from the occasional “sleep”, say researchers.

Writing in the November issue of PLOS Computational Biology, senior author Maxim Bazhenov, Professor of Medicine and a sleep researcher at the University of California San Diego School of Medicine, and colleagues discuss how models inspired by biological sleep may help artificial neural networks avoid the threat of “catastrophic forgetting”, making them more useful across a wide range of research areas and applications.

The scientists used spiking neural networks that artificially mimic natural neural systems. They found that when the spiking networks were trained on a new task with occasional off-line periods that mimicked sleep, catastrophic forgetting was mitigated. Like the human brain, “sleep” allowed the networks to replay old memories without explicitly using old training data, say the study authors.

“The brain is very busy when we sleep, repeating what we have learned during the day,” says Bazhenov. “Sleep helps reorganise memories and presents them in the most efficient way.”

In previously published work, Bazhenov and colleagues reported how sleep builds rational memory, the ability to remember arbitrary or indirect associations between objects, people or events, and protects against forgetting old memories.

Neural networks have superhuman speed but are forgetful

Artificial neural networks leverage the architecture of the human brain to improve numerous technologies and systems, from basic science and medicine to finance and social media. In some ways, such as computational speed, they have achieved superhuman performance, but they fail in one key respect: when artificial neural networks learn sequentially, new information overwrites previous information, a phenomenon called “catastrophic forgetting”.
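To make that failure mode concrete, here is a minimal, hypothetical sketch (in Python, not from the study): a tiny linear classifier is trained on one toy task and then on a second, conflicting one with no rehearsal of the first, and its accuracy on the first task typically falls back toward chance.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(axis):
    # Hypothetical toy task: points uniform in [-1, 1]^2, labelled by the
    # sign of one coordinate (axis 0 for task A, axis 1 for task B).
    X = rng.uniform(-1.0, 1.0, size=(400, 2))
    y = (X[:, axis] > 0).astype(float)
    return X, y

def train(w, b, X, y, epochs=300, lr=0.5):
    # Plain gradient descent on the logistic (cross-entropy) loss.
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def accuracy(w, b, X, y):
    return float(np.mean(((X @ w + b) > 0) == (y > 0.5)))

Xa, ya = make_task(axis=0)   # task A: label depends on the first coordinate
Xb, yb = make_task(axis=1)   # task B: label depends on the second coordinate

w, b = np.zeros(2), 0.0
w, b = train(w, b, Xa, ya)
print("accuracy on A after learning A:", accuracy(w, b, Xa, ya))

w, b = train(w, b, Xb, yb)   # sequential training on B, no rehearsal of A
# Accuracy on task A usually drops sharply: the new gradients overwrite
# the weights that encoded the old task.
print("accuracy on A after learning B:", accuracy(w, b, Xa, ya))
```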

“In contrast, the human brain learns continuously and incorporates new data into existing knowledge,” says Bazhenov, “and it typically learns best when new training is interleaved with periods of sleep for memory consolidation.”

When we learn new information, neurons fire in a specific order and this strengthens the synapses between them, says Bazhenov. During sleep, the spiking patterns learned during our awake state are repeated spontaneously in a process called reactivation or replay.

“Synaptic plasticity, the capacity to be altered or moulded, is still in place during sleep and it can further enhance synaptic weight patterns that represent the memory, helping to prevent forgetting or to enable transfer of knowledge from old to new tasks.”
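The study itself uses spiking networks, but the gist of “sleep” as spontaneous replay plus plasticity can be sketched with a much simpler, hypothetical Hopfield-style associative memory (again in Python, not the authors' model): during the offline phase the network is driven only by noise, settles into a memory it already holds, and a Hebbian update reinforces that memory, with no stored training data involved.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64                                   # number of binary (+/-1) units

def hebbian(W, pattern, lr=1.0 / N):
    # Strengthen connections between co-active units (Hebbian plasticity).
    W += lr * np.outer(pattern, pattern)
    np.fill_diagonal(W, 0.0)
    return W

def settle(W, state, steps=20):
    # Recall: repeatedly update every unit from its weighted input.
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1.0
    return state

def sleep(W, cycles=50, lr=0.05 / N):
    # Offline phase: start from noise, let the network fall into a stored
    # attractor (spontaneous replay), and reinforce whatever it replays.
    for _ in range(cycles):
        noise = rng.choice([-1.0, 1.0], size=N)
        replayed = settle(W, noise)
        W = hebbian(W, replayed, lr=lr)
    return W

# "Awake" learning of two memories, with a sleep phase in between.
memory_a = rng.choice([-1.0, 1.0], size=N)
memory_b = rng.choice([-1.0, 1.0], size=N)

W = np.zeros((N, N))
W = hebbian(W, memory_a)                 # learn memory A
W = sleep(W)                             # offline replay reinforces A
W = hebbian(W, memory_b)                 # learn memory B afterwards

# A noisy cue should still recall memory A after learning B.
cue = memory_a.copy()
cue[: N // 4] *= -1                      # corrupt a quarter of the units
recalled = settle(W, cue)
print("overlap with memory A:", float(recalled @ memory_a) / N)
```

In this toy version, the sleep phase plays the role the quote describes: plasticity keeps acting on spontaneously replayed activity, deepening the weight pattern that stores the older memory so that later learning is less likely to erase it.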

When Bazhenov and colleagues applied this approach to artificial neural networks, they found that it helped the networks avoid catastrophic forgetting. 

“It meant that these networks could learn continuously, like humans or animals,” he says. “Understanding how the human brain processes information during sleep can help to augment memory in human subjects. Augmenting sleep rhythms can lead to better memory.”
