Samsung teams up with NAVER for hyperscale AI semiconductors

The two companies intend to combine semiconductor design with proven AI capabilities to maximise the speed and power efficiency of large-scale AI models

Samsung Electronics and internet company NAVER Corporation have announced a collaboration to develop semiconductor solutions tailored for hyperscale artificial intelligence (AI) models. 

Leveraging Samsung’s next-generation memory technologies, the companies intend to combine hardware and software resources to accelerate the handling of massive AI workloads.

Recent advances in hyperscale AI have led to exponential growth in the volume of data that must be processed, says Samsung. However, the performance and efficiency limitations of current computing systems pose significant challenges in meeting these heavy computational requirements, fuelling the need for new AI-optimised semiconductor solutions.

Developing such solutions requires close convergence between the semiconductor and AI disciplines. Samsung is combining its semiconductor design and manufacturing expertise with NAVER’s experience in developing and verifying AI algorithms and AI-driven services to create solutions that significantly improve the performance and power efficiency of large-scale AI.

Optimising memory technologies for large-scale AI systems

Samsung has previously introduced memory and storage that support high-speed data processing in AI applications, from computational storage (SmartSSD) and PIM-enabled high bandwidth memory (HBM-PIM) to next-generation memory supporting the Compute Express Link (CXL) interface. Samsung will now work with NAVER to optimise these memory technologies for large-scale AI systems.

NAVER will continue to refine HyperCLOVA, a hyperscale language model with over 200 billion parameters, while improving its compression algorithms to create a simpler model that significantly increases computational efficiency.
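NAVER has not disclosed the details of its compression pipeline, but the general principle can be illustrated with a minimal weight-quantisation sketch in Python. The function names, the 8-bit setting and the random weights below are illustrative assumptions rather than HyperCLOVA’s actual approach: storing weights as 8-bit integers instead of 32-bit floats reduces memory footprint and traffic roughly four-fold, at the cost of a small rounding error.

import numpy as np

def quantize_weights(weights: np.ndarray, num_bits: int = 8):
    # Symmetric linear quantisation: map floats onto signed num_bits integers.
    qmax = 2 ** (num_bits - 1) - 1                      # 127 for 8-bit
    scale = float(np.max(np.abs(weights))) / qmax       # one scale per tensor
    q = np.clip(np.round(weights / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize_weights(q: np.ndarray, scale: float) -> np.ndarray:
    # Recover an approximate float tensor for use at inference time.
    return q.astype(np.float32) * scale

# Hypothetical example: one layer's weight matrix, compressed and restored.
w = np.random.randn(1024, 1024).astype(np.float32)
q, scale = quantize_weights(w)
w_approx = dequantize_weights(q, scale)
print("max absolute rounding error:", float(np.abs(w - w_approx).max()))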

“Through our collaboration with NAVER, we will develop cutting-edge semiconductor solutions to solve the memory bottleneck in large-scale AI systems,” says Jinman Han, Executive Vice President of Memory Global Sales & Marketing at Samsung Electronics. “With tailored solutions that reflect the most pressing needs of AI service providers and users, we are committed to broadening our market-leading memory lineup including computational storage, PIM and more, to fully accommodate the ever-increasing scale of data.”

“Combining our acquired knowledge and know-how from HyperCLOVA with Samsung’s semiconductor manufacturing prowess, we believe we can create an entirely new class of solutions that can better tackle the challenges of today’s AI technologies,” says Suk Geun Chung, Head of NAVER CLOVA CIC. “We look forward to broadening our AI capabilities and bolstering our edge in AI competitiveness through this strategic partnership.”
