Facebook announces project to solve AI first-person views

By Laura Berrill
Facebook announces Ego4D, a project aimed at solving AI research challenges in “egocentric perception,” or first-person views

The company says the aim of Ego4D is to teach AI systems to comprehend and interact with the world the way humans do, as opposed to the third-person perspective from which most AI currently learns.

Computer vision models have historically learned from millions of photos and videos captured in the third person. Facebook believes that to achieve egocentric perception, next-generation AI systems may need to learn from a different kind of data: videos that show the world from the center of the action.

Global first-person video research

Ego4D brings together a consortium of universities and labs across nine countries, which collected more than 2,200 hours of first-person video featuring more than 700 participants in 73 cities going about their daily lives. It was funded through academic grants to each of the participating universities. Researchers from Facebook Reality Labs - Facebook’s AR and VR-focused research division - also used Vuzix Blade smartglasses to collect an additional 400 hours of first-person video data in staged environments in research labs.

Collecting the data

Kristen Grauman, lead research scientist at Facebook, said of the project: “For AI systems to interact with the world the way we do, the AI field needs to evolve to an entirely new paradigm of first-person perception. That means teaching AI to understand daily life activities through human eyes in the context of real-time motion, interaction, and multisensory observations.”

Ego4D is designed to tackle challenges related to embodied AI, which is a field aiming to develop AI systems with a physical or virtual embodiment, like robots. The researchers hope to improve the performance of AI systems like chatbots, robots, autonomous vehicles, and even smart glasses that interact with their environments, people and other AI.

In an effort to diversify Ego4D, Facebook says that participants were recruited via word of mouth, ads and community bulletin boards in the U.K., Italy, India, Japan, Saudi Arabia, Singapore, and the U.S., across varying ages, professions and genders (45% were female, one participant identified as nonbinary, and three preferred not to disclose a gender). The company also says it is working to expand the project to incorporate data from partners in additional countries, including Colombia and Rwanda. However, Facebook declined to say whether it took into account accessibility and users with major mobility issues.

The company went on to say the Ego4D university consortium will release its data in the coming months. It also plans to launch a challenge early next year inviting researchers to develop AI that understands the first-person perspectives of daily activities.

 
