Ransomware Top Use Case for Autonomous Response Technology
Darktrace, a cybersecurity AI company, announced that ransomware is the top use case for its Autonomous Response technology, as organisations face the growing threat of machine-speed attacks.
Antigena Network is Darktrace’s Autonomous Response technology for enterprises. Powered by self-learning cyber AI, Antigena Network instantly interrupts attacks across cloud services, IoT, and the corporate network with surgical precision, even when the threat is novel or highly targeted. As ransomware attacks continue to pose an existential risk to organisations, Darktrace Antigena allows customers to take proportionate action against all strains of ransomware, known and unknown, in real time, avoiding costly shutdowns and business disruption.
Powered by self-learning Cyber AI, Darktrace says Autonomous Response is ‘a world-first technology that rapidly neutralises a range of novel cyber-attacks by taking highly targeted actions, while allowing normal business operations to continue as usual. Its self-learning technology isolates only the unusual data encryption activity associated with ransomware.’
“The threat of ransomware was a key driver in our adoption of Autonomous Response technology,” commented Leon Shepherd, Chief Information Officer, Ted Baker. “The ransomware that we are up against today moves too quickly for humans to contend with alone – the way we stay ahead is by having Darktrace AI fight back precisely and proportionately on our behalf.”
Darktrace has also extended its Autonomous Response capability to enhance the coverage of servers, allowing the AI to fight back against all forms of fast-moving attacks.
“For us, Autonomous Response technology combats the most sophisticated ransomware attacks out there and it does that within seconds of the threat emerging,” commented Abhay Raman, Chief Security Officer, Sun Life. “Crucially, the AI responds intelligently which allows us to continue normal business operations uninterrupted. This is the future of security.”
What is ransomware?
Ransomware is malware that uses encryption to hold a victim’s information to ransom. A user’s or organisation’s data is encrypted so that files, databases, or applications cannot be accessed, and a ransom is then demanded in exchange for restoring access.
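A toy sketch can illustrate why encrypted data becomes inaccessible without the attacker’s key. Real ransomware strains use strong ciphers such as AES and RSA; the repeating-key XOR below is purely illustrative, and the sample data is hypothetical:

```python
import secrets

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key; applying the same key again decrypts
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

document = b"Quarterly payroll records"   # a stand-in for a victim's file
key = secrets.token_bytes(16)             # held only by the attacker

ciphertext = xor_encrypt(document, key)
assert ciphertext != document                     # the file is now unreadable...
assert xor_encrypt(ciphertext, key) == document   # ...unless you hold the key
```

Without the key, the victim faces either paying the ransom or restoring from backups, which is why ransomware can halt operations entirely.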
A recent IBM and Ponemon Institute study examined more than 500 organisations across 17 countries and regions that sustained a breach in the previous year, and found that the average cost of a data breach in 2020 stood at $3.86 million. The report also found that the United States continued to experience the highest data breach costs, averaging $8.64 million per incident. The healthcare industry sustained the highest costs of any sector, with each breach costing about $7 million to recover from.
Cybersecurity Ventures expects global cybercrime costs to grow by 15 percent per year over the next five years, reaching $10.5 trillion USD annually by 2025, up from $3 trillion USD in 2015.
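Projections like this follow a compound annual growth calculation. A minimal sketch, assuming the quoted 15% rate compounds yearly; the 2020 baseline below is merely what the quoted figures imply, not a number taken from the report:

```python
def compound(value: float, rate: float, years: int) -> float:
    # value grows by `rate` (e.g. 0.15 for 15%) each year for `years` years
    return value * (1 + rate) ** years

# Baseline implied if $10.5 trillion in 2025 follows five years of 15% growth
baseline_2020 = 10.5 / (1.15 ** 5)
print(f"{baseline_2020:.2f} trillion USD")  # roughly 5.22
```

Compounding is why the jump from $3 trillion in 2015 to a projected $10.5 trillion in 2025 is plausible even at a steady annual rate.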
The advantages and disadvantages of AI in cloud computing
Cloud computing offers businesses more flexibility, agility, and cost savings by hosting data and applications in the cloud. AI capabilities are now being combined with cloud computing, helping companies manage their data, uncover patterns and insights in their information, deliver better customer experiences, and optimise workflows.
We take a look at some of the benefits and drawbacks of AI in cloud computing.
The benefits of AI in cloud computing
Lower costs

A major advantage of cloud computing is that it eliminates costs related to on-site data centres, such as hardware and maintenance. Those upfront costs can be prohibitive for AI projects, but in the cloud, enterprises can access AI tools for a monthly fee, making research and development costs more manageable. AI tools can also analyse data and draw insights from it without human intervention, reducing staff costs.
Intelligent insights

AI can identify patterns and trends in large data sets, comparing historical data with the most recent data to give IT teams well-informed, data-backed intelligence. AI tools can also perform data analysis quickly, so enterprises can address customer queries and issues rapidly and efficiently. The observations and recommendations gained from AI capabilities result in faster, more accurate outcomes.
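The historical-versus-recent comparison described above can be sketched with a simple statistical baseline. This is an illustrative z-score check, not any vendor’s method; the login counts and the three-standard-deviation threshold are assumptions for the example:

```python
from statistics import mean, stdev

def zscore(history: list[float], latest: float) -> float:
    # how many standard deviations the latest value sits from the historical mean
    return (latest - mean(history)) / stdev(history)

daily_logins = [980, 1010, 995, 1005, 990, 1000, 1020]  # historical baseline
today = 1600
if abs(zscore(daily_logins, today)) > 3:
    print("flag for review: today's activity deviates sharply from history")
```

Real systems learn richer baselines, but the principle is the same: today’s data is judged against what history says is normal.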
Improved data management
AI enables extensive data management, while cloud computing strengthens information security. Together, they make it possible to process massive amounts of data programmatically and analyse it properly, allowing the business to leverage information that has been “mined” and filtered to meet each need. AI can also be used to transfer data between on-premises and cloud environments.
Increased productivity

Businesses use AI-driven cloud computing to become more efficient and insight-driven. AI can automate repetitive tasks to boost productivity and perform data analysis without any human intervention. IT teams can also use AI to manage and monitor core workflows, freeing them to focus on strategic operations while AI handles the mundane tasks.
Enhanced security

With businesses deploying more applications in the cloud, security is crucial to keeping data safe. IT teams can use AI-powered network security tools that track network traffic and flag issues, such as an anomaly in behaviour.
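One simple way such tools surface anomalies is frequency-based flagging: connections to destinations that are rarely seen get marked for review. A minimal sketch, with hypothetical port numbers and an assumed 5% rarity threshold:

```python
from collections import Counter

def rare_ports(connections: list[int], min_share: float = 0.05) -> set[int]:
    # flag destination ports seen in less than `min_share` of all connections
    counts = Counter(connections)
    total = len(connections)
    return {port for port, n in counts.items() if n / total < min_share}

# mostly HTTPS/HTTP traffic, plus a few connections to unusual ports
traffic = [443] * 60 + [80] * 35 + [4444] * 2 + [8081] * 3
print(sorted(rare_ports(traffic)))  # [4444, 8081] -- worth a closer look
```

Production tools combine many such signals across traffic volume, timing, and endpoints, but rarity relative to a learned baseline is a common building block.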
The drawbacks of AI in cloud computing
Data privacy

Enterprises need to create privacy policies and secure all data when using AI in cloud computing. AI applications require large amounts of data, which can include consumer and vendor information. While some data can be anonymised so that it cannot be tied to personally identifiable information, data is often more valuable when its owner is known. When sensitive information is used, data protection and compliance become major concerns.
Connectivity concerns

IT teams use the internet to send raw data to the cloud service and retrieve processed results. Cloud-based machine learning systems need consistent internet connectivity, so poor access can undermine their advantages.
Latency

While processing data in the cloud is quicker than conventional computing, there is a time lag between transmitting data to the cloud and receiving a response. This is a significant issue when machine learning algorithms run on cloud servers, where prediction speed is one of the primary concerns.