NVIDIA and Cloudera continue to advance AI technologies

Both companies are expanding their partnership to offer more capabilities for GPUs, machine learning and AI in an effort to enhance business digital transformation

Cloudera is continuing to support advanced NVIDIA AI technologies to offer best-in-class applications for enterprise digital transformation.

The company’s support for key NVIDIA technologies spans both public and private cloud environments and is designed to help customers efficiently build and deploy AI applications. This new phase in Cloudera’s technology collaboration with NVIDIA will add multi-generational NVIDIA GPU capabilities for data engineering, machine learning and AI.

The partnership expansion comes as NVIDIA is being impacted by US restrictions on AI chip exports to China, a market that accounts for 25% of the company’s chip revenues.

Working to accelerate AI and machine learning workloads

Cloudera Machine Learning (CML) is a leading service of the Cloudera Data Platform that seeks to empower enterprises to create their own AI applications. The company helps customers unlock the potential of open-source large language models (LLMs) by using their own proprietary data assets to create secure and accurate responses.

Speaking about the platform, Cloudera’s Anthony Behan told AI Magazine: “The Cloudera Data Platform is there to help customers drive innovation” and that it offers a “selection of purpose-built data services allowing expansion across the data lifecycle, from the edge to AI, whether that’s streaming massive data volumes, or deploying and monitoring next generation AI and ML models.”

Cloudera’s CML service will now support the NVIDIA H100 GPU in public clouds and in data centres. This next-generation acceleration aims to empower Cloudera's data platform, enabling faster insights and more efficient generative AI workloads. As a result, customers could gain the ability to fine-tune models on larger datasets and to host larger models in production.

The enterprise-grade security and governance of CML means that businesses can better leverage the power of NVIDIA GPUs without compromising on data security.
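In practice, an H100 is exposed to such workloads as a standard CUDA device. The sketch below is a minimal, generic Python illustration of how an LLM workload might target an NVIDIA GPU from a notebook or session; it is not part of CML or any Cloudera API, it assumes PyTorch and Hugging Face Transformers are installed, and the model name is a placeholder.

```python
# Generic sketch (assumption: PyTorch + Transformers installed, CUDA GPU available).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# An H100 (or any NVIDIA GPU) appears as a CUDA device to the framework.
device = "cuda" if torch.cuda.is_available() else "cpu"
print("Running on:", torch.cuda.get_device_name(0) if device == "cuda" else "CPU")

model_name = "gpt2"  # placeholder open-source model, purely illustrative
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name).to(device)

# Move the prompt tensors to the same device and generate a short completion.
inputs = tokenizer("Generative AI on enterprise data", return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

A real fine-tuning or serving workload would swap in a larger open-source LLM and the enterprise data sources governed by the platform, but the GPU targeting pattern stays the same.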

NVIDIA partnerships continue to enhance global AI operations

Another key benefit is an enhanced ability for users to accelerate data pipelines with GPUs in Cloudera’s private cloud. With Cloudera Data Engineering (CDE), users can build production-ready data pipelines from various sources.

With NVIDIA Spark RAPIDS integration in CDE, extract, transform and load (ETL) workloads can now be accelerated without the need to refactor. Existing Spark ETL applications can be GPU-accelerated by around 7x overall, and by up to 16x on select queries, compared with standard CPUs, allowing customers to better take advantage of GPUs in upstream data processing pipelines.

As a result, the hope is that utilisation of these GPUs will increase, offering a higher return on investment.
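The “without the need to refactor” point reflects how the RAPIDS Accelerator for Apache Spark is switched on through Spark configuration rather than code changes. The sketch below is a generic illustration, not CDE’s specific interface; it assumes the rapids-4-spark plugin jar and a CUDA-capable GPU are available to the cluster, and the file paths and column name are hypothetical.

```python
# Generic PySpark sketch (assumptions: rapids-4-spark jar on the cluster classpath,
# CUDA-capable GPU available; paths and column names are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("existing-etl-job")
    # GPU acceleration is enabled via configuration only; the ETL code below
    # is unchanged, CPU-era Spark code.
    .config("spark.plugins", "com.nvidia.spark.SQLPlugin")
    .config("spark.rapids.sql.enabled", "true")
    .getOrCreate()
)

# Unmodified ETL logic: read, transform, write. Supported operators are planned
# onto the GPU automatically; anything unsupported falls back to the CPU.
df = spark.read.parquet("/data/raw/events")                      # hypothetical input path
cleaned = df.dropDuplicates().filter(F.col("value") > 0)         # hypothetical column
cleaned.write.mode("overwrite").parquet("/data/curated/events")  # hypothetical output path
```

Because the DataFrame code itself is untouched, the same job can be run on CPUs or GPUs depending on configuration, which is what makes higher GPU utilisation across existing pipelines plausible.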

“GPU acceleration applies to all phases of the AI application lifecycle - from data pipelines for ingestion and curation, data preparation, model development and tuning, to inference and model serving,” says Priyank Patel, Vice President of Product Management at Cloudera. 

“NVIDIA's leadership in AI computing perfectly complements Cloudera's leadership in data management, offering customers a complete solution to harness the power of GPUs across the entire AI lifecycle.”

NVIDIA is committed to scaling the data centre to help enterprises stay ahead in generative AI developments. This week, it also announced a partnership with Dell Technologies to serve as a ‘blueprint’ for the next generation of large-scale AI clusters.

Dell CEO Michael Dell says that their new designs will “help meet the demands of LLMs and GenAI applications with one of the fastest AI systems in the world” and feature “2,048 NVIDIA H100 Tensor Core GPUs, 256 Dell PowerEdge XE9680 AI servers, and a robust Spectrum-X Ethernet AI network.”

