Groq: The AI Chip Startup Worth US$2.8bn

Groq continues full steam ahead after its latest funding round, hoping its work on AI semiconductors will compete with tech giants like Nvidia and AMD

Startup Groq continues to scale up its AI efforts after raising US$640m in a Series D funding round.

Led by Cisco Investments, Samsung Catalyst Fund and BlackRock Private Equity Partners, the funding round has brought Groq’s valuation to US$2.8bn. The new funding will go towards expanding the company’s compute capacity for running complex AI systems, according to CEO Jonathan Ross.

Headquartered in California, Groq specialises in producing AI inference chips. These are a highly sought-after type of semiconductor designed to optimise the speed at which pre-trained AI models execute commands.

What sets the company apart is its focus on AI deployment, particularly on accelerating chatbot response times.

Who is Groq? The story so far

Groq was founded in 2016 by a group of former Google engineers led by Jonathan Ross, one of the designers of Google's Tensor Processing Unit (TPU) AI accelerator.

The startup had a tepid start, securing US$10m in its first funding round in 2017. In these early days, Ross’s vision was to design AI chips specifically for inference, the part of AI that mimics human reasoning. Inference enables a pre-trained machine learning model to apply what it has learnt to new situations, thereby drawing conclusions from brand-new data.

However, business wasn’t booming right away. In an interview with Forbes, Ross said: “Groq nearly died many times ... We started Groq maybe a little bit early.”


That was until the launch of ChatGPT in 2022 took the AI world by storm.

Groq suddenly found itself in the midst of an AI chipmaking frenzy, as businesses switched on to the true power of AI technology. The company, like its competitors, is reaping the rewards of the AI boom, as more enterprises strategise to harness AI.

Building inference technology for future AI use

Groq has made powerful advancements in AI inference technology. Its Language Processing Unit (LPU) inference technology is a hardware and software platform designed to deliver high compute speed, quality and energy efficiency.

This technology is particularly powerful for running large language models (LLMs), where it offers some of the fastest inference available. Its LPU Inference Engine addresses common bottlenecks in AI inference, such as compute density and memory bandwidth, which the company claims makes it more efficient than GPU-based solutions.
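To see why memory bandwidth is the bottleneck the article refers to, a common back-of-the-envelope rule is that autoregressive LLM decoding must read every model weight once per generated token, so peak tokens per second per sequence is roughly memory bandwidth divided by model size in bytes. The sketch below is illustrative only: the function name and the bandwidth and model-size figures are assumptions for the example, not vendor specifications for Groq or any rival chip.

```python
def decode_tokens_per_sec(bandwidth_gb_s: float,
                          params_billion: float,
                          bytes_per_param: int = 2) -> float:
    """Rough roofline bound for memory-bound LLM decoding:
    tokens/sec ~= memory bandwidth / bytes of weights read per token."""
    model_bytes = params_billion * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# Illustrative numbers only: a 70B-parameter model in FP16 (2 bytes/param)
# on hardware with 3,000 GB/s of memory bandwidth.
print(round(decode_tokens_per_sec(3000, 70), 1))  # → 21.4
```

The estimate ignores compute, KV-cache traffic and batching, but it shows why chips built around higher effective memory bandwidth, as Groq claims for its LPUs, can translate directly into faster chatbot responses.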

Its infrastructure is designed for both cloud and on-premises solutions as it seeks to offer scalable AI application support for organisations. For Groq, inference speed is vital for maximising AI value.

Alongside its rapid growth, Groq has appointed former Intel and HP executive Stuart Pann as its Chief Operating Officer, while Meta’s Chief AI Scientist Yann LeCun serves as the startup’s technical adviser.

Competitors power ahead with the global AI chip race

AI chips have fast become a hot commodity across the global business landscape, as they are essential hardware for training and running AI models such as chatbots. With big technology organisations such as Google and OpenAI wanting to get their hands on the latest chips, it is the chipmakers that are experiencing success.

Alongside the larger firms like Nvidia and AMD, Groq is just one of many startups vying for a place at the table. In particular, Groq is eager to compete with Nvidia’s dominant position in the chip industry, in the wake of the giant surpassing a US$3tn market cap.


Seeking to build upon its existing offerings, Groq claims that its language processing units (LPUs) are faster and more power efficient than rival chipmakers’ products. In 2024, the company launched GroqCloud, a developer programme designed to attract developers to rent access to its chips. Groq also acquired startup Definitive Intelligence to help build out its growing cloud platform.

Globally, cloud service providers are eager to develop their own AI products, but are often limited due to high demand.

As a result, businesses are seeking alternative processors outside of the main giants, something Groq is hoping to capitalise on.

“We’re nowhere near Nvidia yet,” Ross said in an interview with Forbes. “So all eyes are on us and it’s like, what are you going to do next?”


AI Magazine is a BizClik brand 
