Nov 4, 2020

How AI is helping to build better EV batteries

Paddy Smith
3 min
To compete with combustion engines, EVs must overcome their problematic batteries. AI is doing the heavy lifting...

AI is crucial to the development of better electric vehicle (EV) batteries, according to reporting by the Wall Street Journal.

Batteries are the weak point in portable (and roadgoing) electronics, with the core technology having improved little since the inception of lithium-ion cells. While processing power continues to leap forward, production battery technology is at a standstill. As a result, the biggest advances in battery technology have been in charging, consumption and power management – all driven by processing power.

Batteries: the heart of EVs

It’s not a comforting thought for prospective EV customers, who face ‘range anxiety’, time-consuming charging and the certain degradation of a part of their vehicle that represents a quarter of its cost.

Now unprecedented advances are being made in the field that will be crucial to making EVs attractive to mainstream consumers. And they are being led by AI.

The US Department of Energy has set up the Joint Center for Energy Storage Research at the University of Chicago to explore battery improvements. Its director, George Crabtree, explained that the chemical mixture used in lithium batteries used to be based on human trial and error, but that “machine learning speeds up this materials discovery process by orders of magnitude”.

AI’s chemical recipes

The AI can pick through scientific papers online and materials databases, and run modelling for different combinations of chemical components in cells to ascertain optimum performance for fast charging, long life and slow degradation.

IBM has also been exploring alternatives to nickel and cobalt in a bid to find more sustainable materials and reduce costs. The job of evaluating the 20,000 possible compounds to use as the electrolyte would have taken some five years without AI. IBM was able to employ machine learning to get the job done in nine days.
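The screening approach described above can be illustrated in miniature: train a simple model on the handful of compounds whose properties have already been measured, then use it to rank untested candidates so only the most promising go to the lab. This is a hypothetical sketch, not IBM's actual system – the compound names, features and property values below are all invented.

```python
# Hypothetical sketch of ML-based compound screening. Compound names,
# feature vectors and "conductivity" values are invented for illustration.

def predict(features, training):
    """1-nearest-neighbour prediction: copy the measured value of the
    most similar known compound (squared Euclidean distance in feature space)."""
    best = min(training, key=lambda t: sum((a - b) ** 2
                                           for a, b in zip(features, t[0])))
    return best[1]

def screen(candidates, training, top_n=3):
    """Score every candidate and return the most promising ones first."""
    scored = [(name, predict(f, training)) for name, f in candidates]
    return sorted(scored, key=lambda s: -s[1])[:top_n]

# Measured examples: (feature vector, measured property of interest).
training = [((0.2, 0.9), 4.1), ((0.8, 0.1), 1.3), ((0.5, 0.5), 2.7)]

# Untested candidates to rank instead of synthesising all of them.
candidates = [("cmpd-A", (0.25, 0.85)), ("cmpd-B", (0.75, 0.2)),
              ("cmpd-C", (0.45, 0.55))]

print(screen(candidates, training, top_n=2))
# -> [('cmpd-A', 4.1), ('cmpd-C', 2.7)]
```

The real systems use far richer models and features, but the economics are the same: a cheap predictor prunes 20,000 possibilities down to a short list worth testing.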

The company’s lab director told the WSJ, “The AI is now becoming much more central as we tweak the solvents and electrolytes to get the battery even better in terms of capacity and life cycle, because we have had more time to train it.”

AI testing for EV batteries

Panasonic, which makes batteries for Tesla and Ford, has managed to reduce testing times for charge cycles to around six months in some cases. Without AI, the batteries would have been charged and discharged over a period of three years.

Stanford University, MIT and the Toyota Research Institute have a joint project to work on how to charge EV batteries in 10 minutes, while preserving the life expectancy of the cells. In a report published in Nature, they explained how they had reduced testing time from 500 days to just 16. The study’s co-author William Chueh said, “The AI aspect of our work is all about speeding up the research-and-development cycle so that we can iterate faster and faster.”
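The core trick behind this kind of speed-up is early prediction: instead of cycling a cell until it dies, fit a model that maps features of its first few cycles to its eventual cycle life. Here is a toy illustration of that idea, not the published model – the fade feature and the numbers are invented.

```python
# Toy illustration of early-prediction of battery cycle life. The
# "early-cycle fade feature" and all data points are invented; the real
# work uses richer electrochemical features and larger datasets.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b on a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Early-cycle fade feature vs observed total cycle life for fully tested cells.
feature = [0.010, 0.020, 0.030, 0.040]
life    = [1900,  1450,  1000,   550]

a, b = fit_line(feature, life)

# A fresh cell only needs its early cycles measured to get an estimate,
# so the multi-year charge/discharge test can be cut short.
print(round(a * 0.025 + b))  # -> 1225
```

The payoff is exactly what Panasonic and the Stanford–MIT–Toyota team report: a prediction after weeks of cycling replaces years of exhaustive testing.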


Jun 22, 2021

What is neuromorphic AI?

2 min
Neuromorphic computing – or neuromorphic AI – is the hardware side of artificial intelligence, changing the rules for the future of machine learning

AI is dead. Long live AI?


AI is evolving. The first generation of machine intelligence used ordinary logic and rules to draw conclusions in a very specific manner. A good example would be IBM’s Deep Blue computer, which was programmed to play chess to championship standard. That hasn’t disappeared, but it has been augmented by more perceptive deep learning networks that can analyze a broader set of parameters and provide intelligent insights.


And neuromorphic AI is next?


Correct. Neuromorphic computing is a way of designing hardware – microprocessors, really – to work more like human brains. The idea is that this new iteration of AI hardware will allow machine learning of the future to deal better with ambiguity and contradiction, things that computers currently find difficult to process.


How does neuromorphic AI work?


The problem with current chip architecture is that it is not very efficient. Because of the linearity of the process, the chips have to be built with a massive amount of horsepower just in case it’s needed. Building a human brain that way would be unfeasible, so engineers have had to rethink the nature of chip design in their quest to get computers to perform more of the tasks human brains are good at. Enter SNNs.


What’s an SNN?


A spiking neural network (SNN) is, in the words of chipmaker Intel, “a novel model for arranging those elements to emulate natural neural networks that exist in biological brains.” Each ‘neuron’ fires independently, triggering other neurons only when they are required. Intel again: “By encoding information within the signals themselves and their timing, SNNs simulate natural learning processes by dynamically remapping the synapses between artificial neurons in response to stimuli.”
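The fire-only-when-needed behaviour Intel describes can be sketched with the simplest spiking model, a leaky integrate-and-fire neuron. This is a teaching toy, not how a neuromorphic chip is programmed – the threshold and leak constants are arbitrary.

```python
# Minimal leaky integrate-and-fire neuron, the usual toy model for SNNs.
# Threshold and leak values are arbitrary illustration choices.

def run_neuron(inputs, threshold=1.0, leak=0.9):
    """Accumulate input each timestep, let the potential leak away a
    little, and emit a spike (1) only when the membrane potential crosses
    the threshold -- the neuron then resets and stays silent until the
    next crossing, so it only 'fires' when required."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = potential * leak + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

print(run_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.4]))
# -> [0, 0, 1, 0, 0, 1]
```

Note how the output is mostly zeros: unlike a conventional chip computing at full tilt every cycle, a spiking neuron consumes attention (and, in hardware, energy) only at the moments its inputs demand it.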
