AI at the edge will usher in the hyperconnected era
We’re in the next phase of the digital revolution, where the data centre has stretched to the network edge and where myriad IoT devices process data with the aid of AI. Edge devices combined with AI/ML are giving rise to the next Industrial Revolution, characterised by the decentralisation of computing, communications, and business processes.
Big data and cloud are key drivers behind this revolutionary edge movement. Every day, huge amounts of data are generated, streamed, and moved in cloud environments, from the smart refrigerator that knows you’re out of milk, to cloud gaming, patient monitoring systems in hospitals, retail inventory management systems, and myriad other enterprise cloud applications.
Edge computing reduces cost and latency by enabling data to be processed where it’s generated and consumed. Processing data locally instead of sending it to the cloud increases security, privacy, and reliability, and lets organisations scale up applications more quickly.
That’s why Gartner predicts that by 2025, 75% of enterprise data will be created at the edge, in factories, hospitals, retail stores, and cities, and that much of it will be processed, stored, analysed, and acted on there. More than 50% of enterprise-managed data will be created and processed outside the data centre or cloud.
Infrastructure and operations leaders must plan for this explosive data growth, protecting their enterprises from edge growing pains in security and connectivity, and preparing for evolving edge computing use cases.
To do this, they will need AI supported by Kubernetes.
AI at the Edge: A Disruptive Force
AI is the century’s most disruptive technology: McKinsey’s Tech Trends Outlook 2022 sized the global AI opportunity at $10 trillion to $15 trillion. By automating tasks and analysing data at a previously impossible scale, it is already improving productivity across many enterprises.
But many believe the biggest gains will come from combining AI with edge computing for faster data processing and responses. Embedding AI into IoT endpoints, gateways, and edge servers can significantly improve operational efficiency, customer service, and speed of decision-making.
AI at the edge unleashes innovation and optimises processes across industries, enabling timely understanding of customer data for personalisation of apps and customer service, real-time automation in manufacturing environments, and rapid development and testing of data models.
AI already provides intelligence for self-checkout lanes and wearable devices, helps banks run investment analyses, and improves crop yields through IoT sensors in the field. It’s an underlying technology for cloud-based software subscriptions, contact centres, and other “as a service” platforms. It is the intelligence behind recommendation engines and chatbots (as we’ve recently seen with ChatGPT’s popularity), and a critical technology for making self-driving cars a reality.
Easing Edge Development with Kubernetes
But building edge AI applications to serve all these use cases is difficult. They must handle many forms of data across multiple processing steps, run on distributed platforms, and be kept current, often through continuous updates.
Many edge AI developers are turning to containers to increase efficiency, automate workflows, speed deployment and updates, and improve scalability and security. Most are using cloud-native, open-source Kubernetes to orchestrate their containers.
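To make that concrete, here is a minimal sketch, using the official Kubernetes Python client, of how a containerised edge inference service might be declared as a Deployment. The image name, registry, namespace, labels, and resource figures are illustrative placeholders, not taken from any real deployment:

```python
# Minimal sketch: declaring a containerised edge-AI inference service as a
# Kubernetes Deployment via the official Python client. The image, namespace,
# and resource figures are illustrative placeholders.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig (e.g. for an edge cluster)

labels = {"app": "edge-inference"}
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="edge-inference", labels=labels),
    spec=client.V1DeploymentSpec(
        replicas=2,  # two replicas for basic resilience on the edge node pool
        selector=client.V1LabelSelector(match_labels=labels),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels=labels),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="inference",
                        image="registry.example.com/edge/inference:1.4.2",
                        ports=[client.V1ContainerPort(container_port=8080)],
                        # keep resource demands explicit: edge nodes are small
                        resources=client.V1ResourceRequirements(
                            requests={"cpu": "500m", "memory": "256Mi"},
                            limits={"cpu": "1", "memory": "512Mi"},
                        ),
                    )
                ]
            ),
        ),
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="edge", body=deployment)
```

Because the desired state is declarative, the same definition can be applied unchanged across many edge clusters, which is what makes deployments repeatable at fleet scale.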
The State of Cloud Native Development 2021 report found that 76% of developers working on edge computing applications were using containers, and 63% were using Kubernetes.
Using Kubernetes speeds deployment of new applications by ten times or more, while making development, packaging, and deployment predictable and consistent. It lets AI run across different platforms, toolsets, and chipsets, and supports continuous improvement through dramatically faster large-scale AI updates. Kubernetes can also optimise workload placement to improve edge AI performance, as the sketch below illustrates.
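Placement works because Kubernetes schedules pods onto nodes whose labels and taints match the pod’s constraints, so operators can steer inference pods to edge nodes with accelerators. A minimal sketch, reusing the hypothetical “edge-inference” Deployment above; the node label and taint names here are invented for illustration, since real clusters define their own labelling scheme:

```python
# Sketch: steering the hypothetical "edge-inference" Deployment onto labelled
# accelerator nodes, then rolling out a new model image. The node label and
# taint names are invented for illustration.
from kubernetes import client, config

config.load_kube_config()
apps = client.AppsV1Api()

# Constrain scheduling: run only on nodes labelled as edge GPU nodes, and
# tolerate the taint that reserves those nodes for AI workloads.
apps.patch_namespaced_deployment(
    name="edge-inference",
    namespace="edge",
    body={"spec": {"template": {"spec": {
        "nodeSelector": {"node-role.example.com/edge-gpu": "true"},
        "tolerations": [{
            "key": "example.com/accelerator",
            "operator": "Exists",
            "effect": "NoSchedule",
        }],
    }}}},
)

# A large-scale model update is then a declarative image change; the default
# RollingUpdate strategy replaces pods gradually, so inference stays available.
apps.patch_namespaced_deployment(
    name="edge-inference",
    namespace="edge",
    body={"spec": {"template": {"spec": {"containers": [{
        "name": "inference",
        "image": "registry.example.com/edge/inference:1.5.0",
    }]}}}},
)
```

More expressive affinity rules and topology-spread constraints build on the same labelling mechanism.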
In a survey conducted by Vanson Bourne, 40% of respondents said AI and machine learning were their most popular workloads for Kubernetes, and 88% agreed that in the next two years Kubernetes would be the platform of choice for running AI and machine learning workloads.
Emerging Trends for AI at the Edge
Over the next few years, we can expect AI at the edge use cases to expand substantially, driven by a number of key trends, including edge-native applications.
Most of today’s enterprise edge use cases are primarily cloud use cases adapted to exploit the speed and cost savings of edge processing. Edge-native applications, by contrast, increase edge reliability, enable seamless app mobility, strengthen security and privacy protection, and further reduce latency.
Development has begun on these, but the Linux Foundation believes long-term demand depends on the maturation of bandwidth-heavy and data-hungry technologies, such as augmented and virtual reality, which would benefit greatly from the reduced latency, lower data congestion, and improved mobility of 5G at the edge.
Welcome to the Hyperconnected Era
Digital technology has advanced at a dizzying pace through disruptive technologies like the microprocessor, PC, Internet, mobile devices, cloud computing, AI, machine learning, edge, and IoT.
Now we see the data centre, cloud, and edge converging into an interconnected whole. As edge and AI technologies mature, they will bring us closer to a world where business innovation rivals the e-business revolution unleashed by the Internet in the 1990s.