Samsung Producing Industry’s Thinnest DRAM for On-Device AI

Samsung's ultra-thin memory package creates additional space inside mobile devices and facilitates better airflow, supporting the high-performance applications that on-device AI demands

Samsung has shot out of the blocks with its ambitions to be an AI leader by announcing mass production of the industry's thinnest chip for on-device AI.

Coming mere months after it unveiled its AI-era vision alongside its latest chip technology, the new package is aimed at mobile phones, one of Samsung's main markets.

The ultra-slim package houses 12-nanometer (nm)-class LPDDR5X DRAM in 12-gigabyte (GB) and 16GB configurations, delivering high-density memory in a package just 0.65mm thick.


The continued pursuit of miniaturisation is part of a push to create additional space within mobile devices and facilitate better airflow. This supports easier thermal control, a factor that is becoming increasingly critical for high-performance applications with advanced features such as on-device AI.

Making of a chip

Samsung's new LPDDR5X DRAM packages represent a significant advancement in memory technology, offering not only superior LPDDR performance but also advanced thermal management in an ultra-compact package.

This reduction in thickness was achieved through optimised printed circuit board (PCB) and epoxy moulding compound (EMC) techniques, as well as a specialised back-lapping process.

Samsung's commitment to innovation in this space is evident in its plans to continue expanding the low-power DRAM market.

The company intends to supply its 0.65mm LPDDR5X DRAM to mobile processor makers and device manufacturers, with a focus on developing even more compact 6-layer 24GB and 8-layer 32GB modules for future devices.

Samsung’s AI ambitions 

Samsung's announcement of the industry's thinnest LPDDR5X DRAM packages comes at a pivotal moment. The South Korean tech giant has made it clear that it sees chips being integral to its AI future. 

"At a time when numerous technologies are evolving around AI, the key to its implementation lies in high-performance, low-power semiconductors," said Dr. Siyoung Choi, President and Head of Foundry Business at Samsung Electronics. 

Whilst the likes of Nvidia lead the industry in AI chips, Samsung is developing AI memory solutions such as HBM-PIM (High Bandwidth Memory - Processing-In-Memory) and CXL-PNM (Compute Express Link - Processing-Near-Memory), which integrate computation functions into memory semiconductors to enhance AI processing capabilities.


To that end, the company has unveiled a comprehensive strategy to cement its position as a leader in the AI chip market.

At the heart of this strategy is Samsung's new AI Solutions platform, which leverages the strengths of its Foundry, Memory, and Advanced Package (AVP) businesses to provide customers with tailored, one-stop solutions for their AI needs. 

By leveraging its comprehensive capabilities across chip design, memory production, foundry, and packaging, Samsung is well-positioned to offer the tailored solutions necessary to power the AI revolution.

This holistic approach to chip and package co-design is precisely what Samsung believes will be the key to thriving in the transformative era of AI.

With its commitment to continuous improvement and a deep understanding of the market's evolving needs, Samsung's high-performance, low-power mobile memory solutions may prove crucial in enabling the next generation of AI-powered devices.

