Artificial intelligence (AI) put to work in data centres can deliver a valuable return on investment and have a significant impact on key technologies including cloud services and 5G mobile networks, according to research.
A new report - Artificial Intelligence: Charting the Way Forward for AI: 2022 Survey of IT Leaders and Service Providers on AI Deployment - was released this week by hybrid IT solutions provider CoreSite in collaboration with the market research and competitive analysis group Heavy Reading and Ericsson.
Researchers examined industry trends and future infrastructure requirements for deploying artificial intelligence, and the report highlights key business justifications for service providers to deploy AI in data centres. Benefits include better customer experience and retention, improved network performance, and opportunities for new revenue and cost savings.
According to Heavy Reading, investments in deploying AI in data centres and networks can deliver a valuable return on investment and have a significant impact on key technologies being deployed today, including cloud services and 5G mobile networks.
Vast majority of companies to increase AI and ML work
AI usage is expected to accelerate rapidly, researchers found, with the overwhelming majority of companies surveyed reporting increased use of AI and machine learning (ML). Over the next five years, 82 per cent of respondents expect their company’s use of AI to increase.
The report also found that hybrid IT, which combines the power of on-premises and colocation data centres, is a popular approach for deploying AI. The majority of businesses surveyed plan to deploy AI across a hybrid combination of on- and off-premises data centres, and the results suggest a shift from on-premises to off-premises locations, particularly among mobile network operators.
The report also underlines how critical low-latency networks, interconnection and cloud networking are to deploying AI: more than 80 per cent of respondents rated them as critical or very important parts of their AI/ML infrastructure architectures.
“Low latency networks and artificial-intelligence-as-a-service (AIaaS) will play a significant part in AI/ML infrastructure architecture,” says Heavy Reading’s Analyst-at-Large Simon Stanley. “Heavy Reading expects that the industry will continue to shift AI workloads toward off-premises colocation data centres and edge data centres within a hybrid infrastructure architecture. With the right investments in AI capabilities and resources, service providers can deploy AI in data centres and networks and deliver benefits across many market areas.”