Artificial synapses making memories for more sustainable AI

A team of researchers at Washington University has used quantum tunnelling to boost AI memory consolidation and bring down the energy costs of future systems

A team of US researchers has developed a device that mimics the dynamics of the brain's synapses - the connections between neurons that allow information to pass from one to another. The development could change how artificial systems approach memory consolidation and lead to a new era of energy-efficient, advanced AI technology.

The device was developed by a team at the McKelvey School of Engineering at Washington University in St Louis, led by Professor Shantanu Chakrabartty. It uses quantum tunnelling to create an artificial synapse, providing a much simpler and more energy-efficient connection than previous methods.

The team's research, published in Frontiers in Neuroscience, showed that their artificial synapse could mimic some of the dynamics of biological synapses, allowing AI systems to continuously learn new tasks without forgetting old ones.
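For readers who want a concrete picture of what "learning without forgetting" means, the sketch below is a purely illustrative software analogue: it penalises changes to a weight that mattered for an earlier task, so a later task cannot simply overwrite it. The quadratic penalty and importance score are assumptions made for illustration only; they are not the mechanism of the team's device.

```python
# Illustrative only: a common software analogue of synaptic consolidation,
# not the FN-synapse hardware described in the paper.
# Assumption: a quadratic penalty anchored at the first task's solution,
# weighted by a per-parameter "importance" score.
import numpy as np

def train(w, x, y, importance=0.0, anchor=0.0, lam=0.0, lr=0.1, steps=500):
    """Fit y ~ w*x by gradient descent, optionally resisting drift from anchor."""
    for _ in range(steps):
        grad = 2 * np.mean((w * x - y) * x)            # mean-squared-error gradient
        grad += 2 * lam * importance * (w - anchor)    # consolidation penalty
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
xa, xb = rng.normal(size=100), rng.normal(size=100)
ya, yb = 3.0 * xa, -1.0 * xb                           # task A: y = 3x, task B: y = -x

w_a = train(0.0, xa, ya)                               # learn task A (w ~ 3)
importance = np.mean(xa ** 2)                          # crude importance of w for task A
w_naive = train(w_a, xb, yb)                           # plain retraining forgets task A
w_consol = train(w_a, xb, yb, importance, anchor=w_a, lam=1.0)  # keeps a compromise

print(f"task A: {w_a:.2f}  naive retrain: {w_naive:.2f}  consolidated: {w_consol:.2f}")
```

Retraining without the penalty lands at the task-B solution and loses task A entirely, while the consolidated run settles on a compromise between the two - a toy version of the behaviour the hardware aims to provide natively.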

“The beauty of this is that we can control this device up to a single electron because we precisely designed this quantum mechanical barrier,” says Chakrabartty.

Artificial synapse solves learning tasks

Chakrabartty and doctoral students Mustafizur Rahman and Subhankar Bose designed a prototype array of 128 hourglass devices on a chip less than a millimetre across.

“Our work shows that the operation of the FN synapse is near-optimal in terms of the synaptic lifetime and specific consolidation properties,” says Chakrabartty. “This artificial synapse device can solve or implement some of these continual learning tasks where the device doesn’t forget what it has learned before. Now, it allows for long-term and short-term memory on the same device.”
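The "FN" in FN synapse stands for Fowler-Nordheim tunnelling, the quantum-mechanical process by which electrons leak through a thin barrier at a rate that falls off steeply as the stored charge decays. A rough sense of why this gives both fast initial change and very long retention can be had from the standard Fowler-Nordheim current law; the sketch below integrates it in closed form, with placeholder constants chosen purely for illustration rather than taken from the paper.

```python
# Sketch of why Fowler-Nordheim (FN) leakage decays only logarithmically in time.
# Starting from the standard FN current law I ~ V^2 * exp(-beta/V), the node
# voltage obeys dV/dt = -k * V^2 * exp(-beta/V), which integrates exactly to
#     V(t) = beta / ln(beta*k*t + exp(beta/V0))
# beta, k and V0 below are placeholders for illustration, not figures from the paper.
import numpy as np

beta, k, v0 = 30.0, 1e-3, 8.0        # barrier constant, leak rate, initial volts (assumed)

def v_fn(t_seconds):
    """Closed-form decay of a floating node discharging through FN tunnelling."""
    return beta / np.log(beta * k * t_seconds + np.exp(beta / v0))

for label, t in [("1 second", 1.0), ("1 hour", 3600.0),
                 ("1 year", 3.15e7), ("10 years", 3.15e8)]:
    print(f"{label:>9}:  V = {v_fn(t):.2f} V")
```

Because the voltage falls only with the logarithm of time, the state written in the first seconds is still largely intact years later - a hardware route to short-term activity settling into long-term memory on the same element.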

Chakrabartty says that because the device uses only a few electrons at a time, it consumes very little energy overall.

“Most of these computers used for machine learning tasks shuttle a lot of electrons from the battery, store it on a capacitor, then dump it out and don’t recycle it,” says Chakrabartty. “In our model, we fix the total amount of electrons beforehand and don’t need to inject additional energy because the electrons flow out by the physics itself. By making sure that only a few electrons flow at a time, we can make this device work for long periods of time.” 
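To put "only a few electrons at a time" in perspective, a back-of-the-envelope comparison with a conventional charge-and-dump capacitor is straightforward. The numbers below are generic textbook values, not measurements from the device.

```python
# Back-of-the-envelope energy comparison; generic textbook numbers, not
# measurements from the FN-synapse device.
Q_E = 1.602e-19                      # elementary charge in coulombs

n_electrons = 10                     # "a few electrons" tunnelling per update (assumed)
e_tunnel = n_electrons * Q_E * 1.0   # energy to move them across ~1 V, E = n*q*V

c_node, v_dd = 1e-15, 1.0            # a typical 1 fF on-chip node at a 1 V supply (assumed)
e_switch = c_node * v_dd ** 2        # energy lost charging then dumping the capacitor

print(f"tunnelling {n_electrons} electrons: {e_tunnel:.1e} J")
print(f"one capacitor charge/dump cycle:   {e_switch:.1e} J")
print(f"ratio: roughly {e_switch / e_tunnel:.0f}x")
```

Even with these rough figures, a single charge-and-dump cycle on an ordinary on-chip capacitor costs hundreds of times more energy than letting a handful of electrons tunnel through the barrier.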

The work is part of research Chakrabartty and his lab members are doing to make AI more sustainable. The energy required for current AI computations is growing exponentially, with the next generation of models requiring nearly 200 terajoules to train a single system. And even these systems are not close to matching the capacity of the human brain, which has close to 1,000 trillion synapses.
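For scale, the sums are easy to run: taking the commonly cited figure of roughly 20 watts for the human brain's power draw, 200 terajoules corresponds to hundreds of thousands of years of brain operation.

```python
# Scale check on the 200-terajoule figure, assuming the commonly cited
# ~20 W power draw of the human brain.
train_energy_j = 200e12                  # 200 terajoules
brain_power_w = 20.0                     # watts, approximate
seconds_per_year = 3.156e7

brain_joules_per_year = brain_power_w * seconds_per_year
print(f"200 TJ = {train_energy_j / brain_joules_per_year:,.0f} brain-years of energy")
```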

“Right now, we are not sure about training systems with even half a trillion parameters, and current approaches are not energy-sustainable,” he says. “If we stay on the trajectory that we are on, either something new has to happen to provide enough energy, or we have to figure out how to train these large models using these energy-efficient, dynamic-memory devices.”
