Is the future of machine learning open source?

By Tilly Kenyon
Amazon Web Services partners with Hugging Face to simplify AI-based natural language processing...

Voice-enabled digital assistants surround us, from our smartphones to our speakers, and we often forget about the integrated technology that enables these devices to recognise us.

These capabilities rely on a technology called Natural Language Processing (NLP), which, put simply, trains machine learning models on data sets of text and speech to recognise words, understand the context and structure in which they appear, and derive meaning from them so that a system can take some sort of action. Engineers have worked on NLP for years to refine the technology and make it more accurate, expanding the number of languages, dialects and accents it can recognise.

Hugging Face and Amazon Web Services (AWS) have now partnered to bring over 7,000 NLP models to Amazon SageMaker with accelerated inference and distributed training. 

What is Hugging Face?

Founded in 2016, Hugging Face is a global leader in open-source machine learning (ML), with headquarters in New York and Paris. It is well known for its Transformers library, which makes it easier to access a range of popular natural language neural networks built on AI frameworks such as PyTorch and TensorFlow. Transformers provides thousands of "pre-trained models to perform tasks on texts, such as classification, information extraction, question answering, summarization, translation and text generation in more than 100 languages."
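
To give a flavour of how the library works, here is a minimal sketch of the Transformers pipeline API, which downloads a pre-trained model from the Hugging Face Hub and applies it to a task. The example sentences and task choices are ours for illustration, not anything prescribed by Hugging Face.

```python
# A minimal sketch of the Transformers pipeline API: fetch a pre-trained
# model from the Hugging Face Hub and run a common NLP task locally.
from transformers import pipeline

# Sentiment classification with the library's default pre-trained English model
classifier = pipeline("sentiment-analysis")
print(classifier("Open-source machine learning is moving fast."))
# -> [{'label': 'POSITIVE', 'score': ...}]

# Translation is just another task name; the model is pulled from the Hub
translator = pipeline("translation_en_to_fr")
print(translator("Hugging Face makes NLP accessible."))
```

The same one-line pattern covers the other tasks the library advertises, such as summarisation and question answering, which is much of what makes the models easy to adopt.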

“Hugging Face is a resource for startups and other businesses around the world. Our transformers can help them build virtually any natural language processing application at a fraction of the time, cost, and complexity they could achieve on their own, helping organizations take their solutions to market quickly,” said Clement Delangue, CEO of Hugging Face.

Why does AWS' involvement matter? 

The partnership between AWS and Hugging Face will bring more than 7,000 NLP models to Amazon SageMaker, an ML service used to build, train and deploy machine learning models. 

Hugging Face has also announced a couple of new services built with Amazon SageMaker: AutoNLP, which provides an automated way to train and deploy state-of-the-art NLP models for different tasks, and the Accelerated Inference API, a hosted service for running low-latency predictions with those models.
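
As an illustration only, the Accelerated Inference API is exposed over plain HTTPS, so a hosted model can be queried with a few lines of Python. The model name and token below are placeholders rather than details from the announcement.

```python
# A minimal sketch of calling the hosted Accelerated Inference API over HTTP.
# The model ID and API token are placeholders; substitute your own.
import requests

API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
headers = {"Authorization": "Bearer YOUR_HF_API_TOKEN"}  # placeholder token

response = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "This partnership makes NLP much easier to adopt."},
)
print(response.json())  # e.g. a list of label/score pairs for the classification task
```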

The startup has also chosen AWS as its preferred cloud provider. The collaboration will allow customers of both AWS and Hugging Face to easily train their language models and "take advantage of everything from text generation to summarization to translation to conversational chat bots, reducing the impacts of language barriers and lack of internal machine learning expertise on a business’s ability to expand."
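
For readers wondering what that training workflow looks like in practice, the sketch below uses the Hugging Face estimator in the SageMaker Python SDK. The script name, instance type, framework versions and S3 path are illustrative assumptions, not details taken from the announcement.

```python
# Minimal sketch: fine-tuning a Hugging Face model on Amazon SageMaker.
# Assumes an existing SageMaker execution role, a Transformers training
# script (train.py) and a dataset staged in S3 -- all placeholders here.
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

huggingface_estimator = HuggingFace(
    entry_point="train.py",          # your own Transformers training script
    instance_type="ml.p3.2xlarge",   # GPU instance for fine-tuning (illustrative)
    instance_count=1,
    role=role,
    transformers_version="4.6",      # framework versions shown are illustrative
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters={"epochs": 3, "model_name": "distilbert-base-uncased"},
)

# Launch a managed training job against the S3 dataset (placeholder URI)
huggingface_estimator.fit({"train": "s3://my-bucket/train"})
```

Once the job completes, the same estimator can deploy the trained model to a SageMaker endpoint for inference, which is the "build, train and deploy" loop the partnership is meant to streamline.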

What's the future of open source and AI? 

Open source has had an undeniable impact on the IT industry over the past few years, and when it comes to AI and machine learning, open-source technology has become a key driver of high-speed innovation.

The influx of new technologies such as machine learning, AI and robotics has allowed developers to solve testing and other engineering problems by drawing on the open-source community and learning from some of its best contributors.

There is no doubt that technology will continue to develop, and it is likely that AI and open-source technologies will continue to grow alongside one another.
