AI and AR make mobile phones a lifesaver for new mothers

Artificial intelligence (AI), augmented reality (AR) and real-time computer vision are being used to provide new mothers in Rwanda with medical support

Doctors and researchers from MIT, Harvard University, and Partners in Health (PIH) in Rwanda have developed a mobile health platform that uses artificial intelligence (AI), augmented reality (AR) and real-time computer vision to predict infection in Caesarean section wounds.

An example of mobile health (mHealth) technology, the platform pairs simple mobile phones with AI-powered algorithms and promises accuracy levels of around 90%.

“Early detection of infection is an important issue worldwide, but in low-resource areas such as rural Rwanda, the problem is even more dire due to a lack of trained doctors and the high prevalence of bacterial infections that are resistant to antibiotics,” says the team’s technology lead Richard Ribon Fletcher, who is a research scientist in mechanical engineering at MIT. “Our idea was to employ mobile phones that could be used by community health workers to visit new mothers in their homes and inspect their wounds to detect infection.”

This summer, the team, which is led by Bethany Hedt-Gauthier, a professor at Harvard Medical School, was awarded the US$500,000 first-place prize in the NIH Technology Accelerator Challenge for Maternal Health. The project got underway in 2017, when Fletcher met Hedt-Gauthier at an NIH investigator meeting while she was looking for support with the Caesarean care research she was conducting in Rwanda.

“The lives of women who deliver by Caesarean section in the developing world are compromised by both limited access to quality surgery and postpartum care,” adds Fredrick Kateera, a team member from PIH. “Use of mobile health technologies for early identification, plausible accurate diagnosis of those with surgical site infections within these communities would be a scalable game changer in optimising women's health.”

Real-time computer vision and AR help AI algorithms

The first step in the project was to collect a database of wound images taken by community health workers in rural Rwanda. Over 1,000 images of both infected and non-infected wounds were collected and used to train an algorithm, but many of the photographs were of poor quality.

To improve quality, Fletcher turned to real-time computer vision and augmented reality, processing images as they are captured. “By using real-time computer vision at the time of data collection, we are able to generate beautiful, clean, uniform colour-balanced images that can then be used to train our machine learning models, without any need for manual data cleaning or post-processing,” says Fletcher.
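The article does not publish the team's processing pipeline, but the step Fletcher describes, automatically producing clean, colour-balanced, uniformly sized images at capture time, can be sketched roughly as below. OpenCV and NumPy, the grey-world balancing method and the 224-pixel output size are illustrative assumptions rather than details from the project.

```python
# Minimal sketch (assumption): real-time colour balancing and resizing of a
# wound photo at capture time, so the training set is uniform without manual
# post-processing. The actual pipeline used by the MIT/Harvard/PIH team is
# not described in this article.
import cv2
import numpy as np


def normalise_frame(frame_bgr: np.ndarray, size: int = 224) -> np.ndarray:
    """Grey-world colour balance + centre crop + resize to a fixed size."""
    frame = frame_bgr.astype(np.float32)

    # Grey-world white balance: scale each channel so its mean matches
    # the overall mean intensity of the image.
    channel_means = frame.reshape(-1, 3).mean(axis=0)
    frame *= channel_means.mean() / (channel_means + 1e-6)
    frame = np.clip(frame, 0, 255).astype(np.uint8)

    # Centre crop to a square, then resize to the model's input size.
    h, w = frame.shape[:2]
    side = min(h, w)
    top, left = (h - side) // 2, (w - side) // 2
    crop = frame[top:top + side, left:left + side]
    return cv2.resize(crop, (size, size), interpolation=cv2.INTER_AREA)
```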

Using convolutional neural network (CNN) machine learning models, the software has been able to predict infection in C-section wounds with roughly 90% accuracy within 10 days of childbirth. Women predicted to have an infection are referred to a clinic, where they can receive additional testing and be prescribed life-saving antibiotics as needed.
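As a rough illustration of the kind of CNN classifier described here (the team's actual architecture, framework and training setup are not detailed in the article), a transfer-learning model with a single infection output could be set up as below; PyTorch, torchvision and the MobileNetV2 backbone are assumptions chosen for the sketch.

```python
# Minimal sketch (assumption): a transfer-learning CNN classifier for
# infected vs. non-infected wound images. The backbone and training details
# are illustrative only, not the team's published method.
import torch
import torch.nn as nn
from torchvision import models


def build_wound_classifier() -> nn.Module:
    backbone = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
    # Replace the final layer with a single logit: probability of infection.
    backbone.classifier[1] = nn.Linear(backbone.last_channel, 1)
    return backbone


model = build_wound_classifier()
criterion = nn.BCEWithLogitsLoss()            # binary label: infected or not
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# One illustrative training step on a batch of normalised wound images.
images = torch.rand(8, 3, 224, 224)           # placeholder batch
labels = torch.randint(0, 2, (8, 1)).float()  # placeholder labels
optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```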

“The trust that women have in community health workers, who were a big promoter of the app, meant the mHealth tool was accepted by women in rural areas,” says Anne Niyigena of PIH.

One of the biggest hurdles to scaling this AI-based technology to a more global audience is algorithmic bias: when images of patients with a wider range of skin tones were introduced, the algorithm proved less effective. In response, the team added thermal cameras that attach to a mobile phone to capture infrared images of wounds, and trained algorithms on the heat patterns in those images to predict infection.

“We’re giving the health staff two options: in a homogenous population, like rural Rwanda, they can use their standard phone camera, using the model that has been trained with data from the local population. Otherwise, they can use the more general model which requires the thermal camera attachment,” says Fletcher.
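Fletcher's quote amounts to a simple routing rule between two models. A hypothetical sketch of that choice, with all names invented for illustration:

```python
# Hypothetical sketch of the two-option workflow the quote describes: a phone
# camera model trained on the local population, or a more general model that
# requires the thermal camera attachment. Names and logic are illustrative.
from enum import Enum, auto


class CaptureMode(Enum):
    LOCAL_RGB = auto()        # standard phone camera, locally trained model
    GENERAL_THERMAL = auto()  # thermal attachment, population-general model


def choose_model(population_matches_training: bool,
                 has_thermal_attachment: bool) -> CaptureMode:
    """Pick which capture mode and model a health worker should use."""
    if population_matches_training:
        return CaptureMode.LOCAL_RGB
    if has_thermal_attachment:
        return CaptureMode.GENERAL_THERMAL
    raise RuntimeError("No suitable model: attach the thermal camera or use "
                       "a model trained on the local population.")
```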

While the current generation of the mobile app uses a cloud-based algorithm to run the infection prediction model, the team is working on a stand-alone, connection-free mobile app that looks at other aspects of maternal health.
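The article does not say which framework or mobile runtime the team uses, but a common way to move from a cloud-hosted model to a connection-free app is to export the trained network to a portable on-device format. A minimal sketch, assuming a PyTorch model like the earlier example and an ONNX export:

```python
# Illustrative only: exporting a trained classifier to ONNX so it can run
# on-device with a mobile inference runtime instead of a cloud call. The
# team's actual framework and runtime are not stated in the article.
import torch
from torchvision import models

model = models.mobilenet_v2(weights=None)  # stand-in for the trained classifier
model.classifier[1] = torch.nn.Linear(model.last_channel, 1)
model.eval()

dummy_input = torch.rand(1, 3, 224, 224)   # one normalised wound image
torch.onnx.export(
    model,
    dummy_input,
    "wound_classifier.onnx",
    input_names=["image"],
    output_names=["infection_logit"],
)
# The exported file can be bundled with the app and executed offline by a
# mobile runtime (e.g. ONNX Runtime Mobile), removing the cloud dependency.
```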
