Is the future of machine learning open source?

By Tilly Kenyon
Amazon Web Services partners with Hugging Face to simplify AI-based natural language processing...

Voice-enabled digital assistants surround us, from our smartphones to our smart speakers, and we often forget about the integrated technology that enables these devices to recognise us.

These capabilities rely on a technology called Natural Language Processing (NLP). Put simply, NLP trains machine learning models on data sets of text and speech to recognise words, understand the context and structure in which they appear, and derive meaning so that the system can take some sort of action. Engineers have worked on NLP for years to refine the technology and make it more accurate, expanding the range of languages, dialects and accents it can recognise.

Hugging Face and Amazon Web Services (AWS) have now partnered to bring over 7,000 NLP models to Amazon SageMaker with accelerated inference and distributed training. 

What is Hugging Face?

Founded in 2016, Hugging Face is a global leader in open-source machine learning (ML), with headquarters in New York and Paris. It is well known for its Transformers library, which makes it easier to access a range of popular natural language neural networks built on ML frameworks such as PyTorch and TensorFlow. Transformers provides thousands of "pre-trained models to perform tasks on texts, such as classification, information extraction, question answering, summarization, translation and text generation in more than 100 languages."
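As a rough illustration of why the library lowered the barrier to entry, the snippet below uses the Transformers `pipeline` helper to run one of the tasks mentioned above (sentiment classification). This is a minimal sketch, assuming the `transformers` package and a backend such as PyTorch are installed; the first call downloads a default pre-trained model.

```python
# Minimal sketch of the Transformers high-level pipeline API.
# Assumes: `pip install transformers torch` has been run.
from transformers import pipeline

# Loads a default pre-trained sentiment model the first time it runs.
classifier = pipeline("sentiment-analysis")

# Run inference on a single sentence; returns a list of dicts
# with a predicted label and a confidence score.
result = classifier("Hugging Face makes NLP remarkably approachable.")
print(result)
```

The same one-liner pattern works for other tasks named in the quote, e.g. `pipeline("summarization")` or `pipeline("translation_en_to_fr")`, each pulling an appropriate pre-trained model.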

“Hugging Face is a resource for startups and other businesses around the world. Our transformers can help them build virtually any natural language processing application at a fraction of the time, cost, and complexity they could achieve on their own, helping organizations take their solutions to market quickly,” said Clement Delangue, CEO of Hugging Face.

Why does AWS' involvement matter? 

The partnership between AWS and Hugging Face will bring more than 7,000 NLP models to Amazon SageMaker, an ML service used to build, train and deploy machine learning models. 

Hugging Face also announced two new services built on Amazon SageMaker: AutoNLP, which provides an automatic way to train and deploy state-of-the-art NLP models for different tasks, and the Accelerated Inference API, a hosted service for serving fast predictions from those models.

The startup has also chosen AWS as its preferred cloud provider. This collaboration will allow customers from both AWS and Hugging Face to be able to easily train their language models and "take advantage of everything from text generation to summarization to translation to conversational chat bots, reducing the impacts of language barriers and lack of internal machine learning expertise on a business’s ability to expand."
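To make the integration concrete, the configuration sketch below shows the shape of a SageMaker training job for a Hugging Face model using the `sagemaker` Python SDK's `HuggingFace` estimator. It is a sketch only: the script name `train.py`, the IAM role, the S3 path and the hyperparameters are hypothetical placeholders, and running it requires an AWS account with SageMaker access.

```python
# Configuration sketch: launching a Hugging Face training job on
# Amazon SageMaker. Placeholder values (role, S3 path, script) are
# illustrative, not taken from the announcement.
from sagemaker.huggingface import HuggingFace

huggingface_estimator = HuggingFace(
    entry_point="train.py",              # hypothetical training script
    instance_type="ml.p3.2xlarge",       # GPU instance for training
    instance_count=1,
    role="arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder
    transformers_version="4.6",
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters={
        "epochs": 3,
        "model_name": "distilbert-base-uncased",
    },
)

# Start distributed training against data staged in S3 (placeholder URI).
huggingface_estimator.fit({"train": "s3://my-bucket/train-data/"})
```

The estimator handles provisioning the training instances, pulling a Hugging Face deep learning container, and running the supplied script, which is the "train and deploy" workflow the partnership is meant to streamline.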

What's the future of open source and AI? 

Open source has had an undeniable impact on the IT industry over the past few years, and in AI and machine learning it has been a major driver of rapid innovation.

The influx of new technologies such as machine learning, AI and robotics has allowed developers to solve testing and other engineering problems by drawing on the open source community and learning from some of the best developers in the field.

There is no doubt that technology will continue to develop, and it is likely that AI and open source will continue to grow alongside one another.

