MIT technique means a new life for AI training on mobile devices

Low-power mobile devices such as smartphones and even smart keyboards could provide lifelong learning for artificial intelligence models using edge computing

Researchers at MIT and the MIT-IBM Watson AI Lab have developed a technique that enables on-device training using less than a quarter of a megabyte of memory. The new approach enables artificial intelligence (AI) models to learn from new data on devices like smartphones and sensors, reducing energy costs and privacy risks.

Microcontrollers power billions of connected devices but have limited memory and no operating system, which makes it difficult to train artificial intelligence models on these “edge devices”. Training a machine-learning model on an intelligent edge device allows it to adapt to new data and make better predictions, say the MIT researchers.

Training a model on a smart keyboard could enable the keyboard to continually learn from the user’s writing, for example. But the training process typically demands so much memory that it must be carried out in a centralised data centre.

The intelligent algorithms and framework the researchers have developed reduce the amount of computation required to train a model, which makes the process faster and more memory efficient. This technique also preserves privacy by keeping data on the device, which could be especially beneficial when data are sensitive, such as in medical applications. 

“Our study enables IoT devices to not only perform inference but also continuously update the AI models to newly collected data, paving the way for lifelong on-device learning,” says Song Han, an Associate Professor in the Department of Electrical Engineering and Computer Science (EECS), a member of the MIT-IBM Watson AI Lab, and senior author of a paper describing the new techniques. “The low resource utilisation makes deep learning more accessible and can have a broader reach, especially for low-power edge devices.”

New MIT machine learning models build on TinyML advances

Co-lead authors on the paper are EECS PhD students Ji Lin and Ligeng Zhu, as well as MIT postdocs Wei-Ming Chen and Wei-Chen Wang, and Chuang Gan, a principal research staff member at the MIT-IBM Watson AI Lab. The research will be presented at the Conference on Neural Information Processing Systems.

Han and his team previously addressed the memory and computational bottlenecks that exist when trying to run machine-learning models on tiny edge devices, as part of their TinyML initiative.

Han and his collaborators employed two algorithmic solutions to make the training process more efficient and less memory-intensive. They also developed a system, called a tiny training engine, that can run these algorithmic innovations on a simple microcontroller that lacks an operating system.
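The article describes these solutions only at a high level. As a loose illustration (not the researchers' actual code or algorithms), one way to shrink training memory is to freeze most of a model's parameters and apply gradient updates only to a small trainable subset, so that gradient buffers and intermediate activations for the frozen parts never need to be stored. The sketch below, in plain Python with a hypothetical one-parameter model, shows the idea:

```python
# Illustrative sketch only: training that updates a small subset of parameters.
# Here a tiny linear model y = w*x + b keeps its weight w frozen and adapts
# only the bias b to new on-device data. Because w is never updated, no
# gradient storage is needed for it - the kind of saving that makes
# memory-constrained on-device training plausible.

def predict(x, w_frozen, b_trainable):
    """One-parameter-trainable model: y = w*x + b (w is frozen)."""
    return w_frozen * x + b_trainable

def train_bias_only(data, w_frozen, b=0.0, lr=0.1, epochs=50):
    """Gradient descent on the bias alone, with squared-error loss."""
    for _ in range(epochs):
        for x, target in data:
            y = predict(x, w_frozen, b)
            # dL/db for L = (y - target)^2 is 2*(y - target);
            # w_frozen receives no update and needs no gradient buffer.
            b -= lr * 2.0 * (y - target)
    return b

# Adapt to a user whose data is offset by +1 from the frozen model's fit.
data = [(x, 2.0 * x + 1.0) for x in range(-3, 4)]
b = train_bias_only(data, w_frozen=2.0)
print(round(b, 3))  # converges to 1.0
```

The same principle scales up: the fewer tensors that require gradients, the less working memory the training loop needs, at the cost of less flexibility in what the model can adapt.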

“On-device learning is the next major advance we are working toward for the connected intelligent edge,” says Jilei Hou, Vice President and Head of AI Research at Qualcomm, who helped fund the research. “Professor Song Han’s group has shown great progress in demonstrating the effectiveness of edge devices for training.”

This work is funded by the National Science Foundation, the MIT-IBM Watson AI Lab, the MIT AI Hardware Program, Amazon, Intel, Qualcomm, Ford Motor Company, and Google.
