Artificial synapses making memories for more sustainable AI

Researchers at Washington University in St Louis have used quantum tunnelling to boost AI memory consolidation and bring down the energy cost of future systems

A team of US researchers has developed a device that mimics the dynamics of the brain's synapses - the connections between neurons that transfer information. The development could change how artificial systems handle memory consolidation and help usher in a new era of energy-efficient, more capable AI technology.

The device was developed by a team at the McKelvey School of Engineering at Washington University in St Louis, led by Professor Shantanu Chakrabartty. It uses quantum tunnelling to create an artificial synapse, providing a much simpler and more energy-efficient connection than previous methods.

The team's research, published in Frontiers in Neuroscience, showed that their artificial synapse could mimic some of the dynamics of biological synapses, allowing AI systems to continuously learn new tasks without forgetting old ones.

“The beauty of this is that we can control this device up to a single electron because we precisely designed this quantum mechanical barrier,” says Chakrabartty.

Artificial synapse solves learning tasks

Chakrabartty and doctoral students Mustafizur Rahman and Subhankar Bose designed a prototype array of 128 hourglass devices on a chip smaller than a millimetre.

“Our work shows that the operation of the FN synapse is near-optimal in terms of the synaptic lifetime and specific consolidation properties,” says Chakrabartty. “This artificial synapse device can solve or implement some of these continual learning tasks where the device doesn’t forget what it has learned before. Now, it allows for long-term and short-term memory on the same device.”
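To make the idea of long-term and short-term memory living in one element more concrete, here is a deliberately generic sketch. It is not the FN synapse's internal physics as described in the paper, and every rate constant is an invented placeholder: a single synaptic element keeps a volatile short-term trace plus a slowly decaying consolidated trace, so earlier learning fades gradually instead of being overwritten by new updates.

```python
# Illustrative toy model only: one synaptic element with a fast (short-term)
# and a slow (long-term) component. New updates land in the fast trace and are
# gradually consolidated into the slow trace. All rates are arbitrary
# placeholder values, not parameters of the FN synapse.
from dataclasses import dataclass

@dataclass
class TwoTimescaleSynapse:
    fast: float = 0.0            # short-term trace: volatile, absorbs new updates
    slow: float = 0.0            # long-term trace: consolidated, decays very slowly
    consolidation: float = 0.05  # fraction of the fast trace moved to slow per step
    fast_decay: float = 0.20     # per-step decay of the short-term trace
    slow_decay: float = 0.001    # per-step decay of the long-term trace

    def update(self, delta: float) -> None:
        """Apply a learning update, then consolidate and decay both traces."""
        self.fast += delta
        transfer = self.consolidation * self.fast
        self.slow += transfer
        self.fast = (self.fast - transfer) * (1.0 - self.fast_decay)
        self.slow *= 1.0 - self.slow_decay

    @property
    def weight(self) -> float:
        """Effective synaptic weight seen by the rest of the network."""
        return self.fast + self.slow

syn = TwoTimescaleSynapse()
for _ in range(50):       # task A: repeated potentiation of this synapse
    syn.update(+1.0)
for _ in range(200):      # task B trains elsewhere: this synapse gets no new updates
    syn.update(0.0)
print(f"weight after task B: {syn.weight:.2f}")  # consolidated memory of task A persists
```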

Chakrabartty says that because the device uses only a few electrons at a time, its overall energy consumption is very low.

“Most of these computers used for machine learning tasks shuttle a lot of electrons from the battery, store it on a capacitor, then dump it out and don’t recycle it,” says Chakrabartty. “In our model, we fix the total amount of electrons beforehand and don’t need to inject additional energy because the electrons flow out by the physics itself. By making sure that only a few electrons flow at a time, we can make this device work for long periods of time.” 
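The "flow out by the physics itself" behaviour can be pictured with a simple numerical sketch. The model below is illustrative only - it is not the team's published device model, and the constants a, b and the timescale are made-up placeholders - but it shows the key property: a fixed packet of charge leaking through a Fowler-Nordheim-style tunnelling barrier decays ever more slowly, with no extra energy injected after the device is programmed.

```python
# Minimal, illustrative sketch (not the team's device model): a fixed packet of
# charge on a floating node leaks only through a Fowler-Nordheim-style barrier,
# following the standard FN form I ~ V^2 * exp(-b / V), rescaled into
# made-up illustrative units. No energy is added once the state is set.
import math

def fn_rate(v: float, a: float = 1e-3, b: float = 40.0) -> float:
    """Fowler-Nordheim-like leakage rate: dV/dt = -a * V^2 * exp(-b / V)."""
    return a * v * v * math.exp(-b / v)

def simulate(v0: float = 8.0, t_end: float = 1e6, steps: int = 200_000):
    """Forward-Euler integration of the slow tunnelling decay of one synapse."""
    dt = t_end / steps
    v, samples = v0, []
    for i in range(steps):
        v -= dt * fn_rate(v)
        if (i + 1) % (steps // 5) == 0:
            samples.append((round((i + 1) * dt), round(v, 3)))
    return samples

for t, v in simulate():
    # The decay keeps slowing down: a log-like forgetting curve,
    # reminiscent of the consolidation dynamics described above.
    print(f"t = {t:>9} s  ->  V = {v} V")
```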

The work is part of research Chakrabartty and his lab members are doing to make AI more sustainable. The energy required for current AI computations is growing exponentially, with the next generation of models expected to need nearly 200 terajoules to train a single system. And these systems are still nowhere near the capacity of the human brain, which has close to 1,000 trillion synapses.

“Right now, we are not sure about training systems with even half a trillion parameters, and current approaches are not energy-sustainable,” he says. “If we stay on the trajectory that we are on, either something new has to happen to provide enough energy, or we have to figure out how to train these large models using these energy-efficient, dynamic-memory devices.”
