Dell and Interxion: managing the mass influx of data with AI

As connected devices generate ever more data, Dell and Interxion explore how AI can manage the predicted influx

Coined more than ten years ago, the term Data Gravity refers to the attraction between data and applications: just as mass attracts mass under the law of gravity, large bodies of data pull applications and services towards them. Now one of the most high-profile trends in the data centre and cloud industries, Data Gravity impacts businesses significantly and, as enterprises feel a stronger pull towards their data, new technologies will come into play.

Like many other industries, the data centre industry has woken up to the power of artificial intelligence (AI), and many colocation providers are reaping the benefits of this multifaceted technology.

Reflecting on the impact already made, Tim Loake, VP of Infrastructure Solutions Group, UK, at Dell Technologies, said: “By far the biggest impact is in the Business Value delivery space; AI technology is becoming a necessity for many data-driven businesses. The implementation of AI allows businesses to deliver increased insights, provide better forecasts, identify patterns and anomalies, and ultimately improve performance. It also brings with it significant compute demands to process that data, often in real-time, requiring ever-higher levels of service provision from our data centres.”

Interestingly, Lex Coors, Chief Data Centre Technology and Engineering Officer at Interxion, is more sceptical about the capabilities of AI: “AI has not yet impacted the industry as true AI does not yet exist. For true AI – and we should not take less as an objective – we need emotion, intuition and feeling.”

He does argue that what we are seeing in the technology space is intelligent machine learning (IML), and outlines two ways it is benefitting the data centre industry (both are sketched in the code example after the list):

  • “Optimising the power usage effectiveness (PUE) of data centres
  • Cutting maintenance costs by means of ‘Condition-based Maintenance’, where analysis of wear-and-tear and run-time data identifies the maintenance needed.”
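To make those two ideas concrete, here is a minimal sketch in Python. The power figures, sensor readings and thresholds are hypothetical, invented purely for illustration; this is not Interxion's actual tooling.

```python
# Minimal sketch of the two IML applications Coors describes.
# All figures, readings and thresholds are hypothetical.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power divided by
    IT equipment power. 1.0 would mean zero overhead; lower is better."""
    return total_facility_kw / it_equipment_kw

def needs_maintenance(readings: list[float], baseline: float,
                      tolerance: float = 0.15) -> bool:
    """Condition-based maintenance: flag a unit when the average of
    its last ten sensor readings drifts more than `tolerance` away
    from its healthy baseline, instead of servicing it on a fixed
    calendar schedule."""
    window = readings[-10:]
    recent = sum(window) / len(window)
    return abs(recent - baseline) / baseline > tolerance

# A site drawing 1,500 kW in total to power 1,000 kW of IT load:
print(f"PUE: {pue(1500, 1000):.2f}")  # -> PUE: 1.50

# A cooling fan whose vibration has crept above its 0.80 baseline:
vibration = [0.81, 0.83, 0.84, 0.88, 0.91, 0.95, 1.02, 1.05, 1.08, 1.12]
print(needs_maintenance(vibration, baseline=0.80))  # -> True
```

In production, the maintenance check would draw on far richer telemetry and a trained model rather than a fixed threshold, but the shape of the decision is the same.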

AI and the Zettabyte era

Along with Data Gravity and the influx of connected devices generating information for enterprises, businesses also need to address the challenges brought about by the Zettabyte era.

The way devices connect with each other and the way data flows are changing, and enterprises are beginning to feel the effects, particularly as the impact of the internet of things (IoT) becomes more significant.

By 2025, connected devices alone will generate an estimated 79 zettabytes of information. Considering that, in 2016, the entire volume of data on earth amounted to just 18 zettabytes, the scale of this growth becomes astounding: more than a fourfold increase in under a decade.

Commenting on this, Loake said: “The data decade is dramatically reshaping lives, reinventing the way we work and live. IDC predicts 175 zettabytes of data worldwide by 2025. Leveraging that data, identifying patterns and deriving value from it is critical for enterprises.”

This is where AI comes into play: the technology has the ability to transform the way we gather, store and use data, enabling businesses of all sizes to answer bigger questions, make more discoveries and keep pace with competitors.

“As the volume of data grows, organisations need to find a way to effectively manage and process that data, as well as derive actionable insights from it. Accelerating intelligent, automated outcomes means putting that data first, managing the data lifecycle and using that data to inform decision making. Within the next few years, IDC expects AI to begin permeating business processes for most enterprises. Essentially, more data will drive better products and services, improve customer experience, and create ever more relevant business insights,” added Loake.

Concerned about the impact of the Zettabyte era if AI isn’t developed to its full potential, Coors explained: “The Zettabyte era as such could further enhance and ease life for humans, but there is a caveat: it could instead simply increase the volume of work, and this will then lead to burnout. In this case, IML will not be able to protect the workforce, as it runs programmes set on further optimising the company.”

Nevertheless, he went on to explain the profound impact that true AI could have on this predicted influx of data: “AI will have the intuition that might be needed to be able to balance each and every aspect brought about by the Zettabyte era. It will be able to realise the emotion of the situation, show feeling for the personal situation of the employee and help to make decisions – or give advice – to balance company expectations and human health.”

Creating the right strategy to enable AI to cope with the influx of data

Loake urges that, as business models become more data-driven, enterprises develop an end-to-end AI strategy that is integrated across the underlying IT infrastructure: “Those effectively deploying AI will have their AI infrastructure distributed across edge, core, and cloud deployment locations. Getting the underlying infrastructure right is key to the long-term success of AI. To maximise ROI from AI, businesses need to move beyond proof of concepts (POCs) and get to production and scale.”

Echoing this, Coors also believes in the importance of a robust, well-defined strategy when it comes to AI technologies: “In the case of true AI, companies should draft a first problem statement taking into consideration the company’s goals, objectives and corporate social view; AI will be able to provide businesses with a few options seen from various angles, so accurately explaining and predicting what the impact would be is important.”

This is particularly significant as fewer than 15% of AI models make the transition from proof of concept to production, and fewer still are taken to operate at scale. To create the right foundation for a robust strategy, Loake outlined the need to focus on data culture, data quality and data privacy.

AI: saving time and money in the data centre industry

As the influx of data puts pressure on the data centre industry, AI can manage workloads through automation. This could lead to the creation of several smaller, interconnected edge data centres, all of which would be managed by a single administrator.
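As a rough illustration of that model, the Python sketch below places incoming workloads on whichever of several edge sites has the most spare capacity – the kind of decision an automated controller would make on an administrator’s behalf. The site names, capacities and demands are hypothetical and do not describe any real deployment.

```python
# Toy illustration of automated workload placement across several
# small edge data centres overseen by one administrator.
# Site names, capacities and workload demands are hypothetical.

from dataclasses import dataclass

@dataclass
class EdgeSite:
    name: str
    capacity_units: int   # total compute capacity of the site
    allocated: int = 0    # capacity currently in use

    @property
    def free(self) -> int:
        return self.capacity_units - self.allocated

def place_workload(sites: list[EdgeSite], demand: int) -> EdgeSite | None:
    """Greedy placement: send the workload to the site with the most
    free capacity, or return None if no site can absorb it."""
    best = max(sites, key=lambda site: site.free)
    if best.free < demand:
        return None
    best.allocated += demand
    return best

sites = [EdgeSite("edge-london", 100), EdgeSite("edge-paris", 80)]
for demand in (30, 50, 40):
    chosen = place_workload(sites, demand)
    print(f"workload of {demand} units -> "
          f"{chosen.name if chosen else 'rejected'}")
```

A production controller would weigh latency, data locality and power alongside raw capacity, but the single-administrator, many-sites shape is the same.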

Concluding, Loake outlined the potential use cases of AI within the industry: “AI will save time and money by automating and optimising routine processes and tasks. AI will increase productivity and operational efficiency by allowing businesses to make faster decisions based on outputs from cognitive technologies. AI systems, correctly deployed, will also help businesses avoid mistakes and reduce ‘human error’ – this is perhaps the biggest potential short-term benefit.”
