Amazon unveils next-generation AI chip to rival Microsoft

AWS will begin offering its new Trainium2 training chips in 2024
Growing demand for AI-powered solutions shows businesses across all industries embracing the technology

The race for AI dominance is intensifying as the technology's potential to transform industries across the globe becomes clear. AI is already used in a number of industries to automate tasks, improve customer service, and support better decision-making.

The application of AI is expected to keep growing in the years to come, and companies that successfully harness the technology will have a significant advantage in the digital economy.

What is driving the AI market?

Organisations across all industries are recognising the value of AI and looking for ways to incorporate it into their operations. At the same time, the increasing availability of data is fuelling AI's development: companies with access to large datasets can train models that are more accurate and powerful.

The cost of computing power has been steadily declining, making it more affordable for companies to deploy AI-powered applications and further driving demand for AI solutions. Meanwhile, growing interest from governments and regulators in AI's potential to address social and economic challenges is leading to increased investment in AI research and development.

A new chip to compete in the market

At the re:Invent 2023 conference in Las Vegas, Amazon Web Services (AWS) Chief Executive Adam Selipsky announced Trainium2, the second generation of the company's chip designed for training AI systems. Selipsky said the new version is four times as fast as its predecessor while being twice as energy efficient.

As reported by AWS, in 2022, the company released a high-performance machine learning chip, Trainium, which significantly accelerates the training of generative AI models, reducing training time from months to weeks, or even days, in some cases. Trainium's ability to reduce both costs and energy consumption makes it an attractive choice for model development, offering potential savings of up to 50% in costs and 29% in energy consumption compared to comparable instances.

The release of Trainium2 follows Microsoft's announcement of its own AI chip, Maia. Trainium2 will also face competition from Alphabet's Google, which has made its Tensor Processing Unit (TPU) available to cloud computing customers since 2018.

Selipsky said AWS will begin offering the new Trainium2 training chips in 2024. The surge in custom-designed chips reflects intensifying competition to secure the computing power needed to develop technologies such as large language models (LLMs).

Amazon's decision to delve into AI chip development reflects the growing significance of AI in the cloud computing landscape. Cloud computing firms offer their chips as a complement to those of Nvidia, the market leader in AI chips, whose products have faced supply constraints for the past year.

As AI applications become increasingly prevalent, cloud providers are under immense pressure to deliver the necessary computing power to support these applications effectively. Trainium2 serves as a testament to Amazon's commitment to addressing this growing demand and maintaining its competitive edge in the cloud computing market.

