Human reviews of AI decisions require legal protection, warns BCS

By BizClik Admin
BCS, The Chartered Institute for IT, warns that human reviews of AI decisions need legal protection

BCS, The Chartered Institute for IT, issued the warning following the launch of the ‘Data: A New Direction’ consultation by the Department for Digital, Culture, Media and Sport (DCMS).

Post-Brexit data regulations

The consultation aims to re-examine the UK’s data regulations post-Brexit. EU laws that applied while the UK was part of the bloc, such as the GDPR, will be reviewed to determine whether a better balance can be struck between protecting data privacy and ensuring that innovation is not stifled.

Earlier this year the Culture Secretary, Oliver Dowden, said: “There’s an opportunity for us to set world-leading, gold standard data regulation which protects privacy, but does so in as light-touch a way as possible.”

The DCMS is now considering the removal of Article 22 of the GDPR, which provides the right to a human review of fully automated decisions.

Examination of automated decisions

Dr Sam De Silva, Chair of BCS’ Law Specialist Group and a partner at law firm CMS, explained: 

“Article 22 is not an easy provision to interpret and there is danger in interpreting it in isolation like many have done.

“We still do need clarity on the rights someone has in the scenario where there is fully automated decision making which could have a significant impact on that individual.”

Artificial intelligence systems are being used for increasingly critical decisions, including whether to offer loans or approve insurance claims. Given the unsolved issues with bias, there is a real risk that discrimination itself becomes automated.

One school of thought holds that humans should always make the final decision, especially on choices that affect people’s lives. BCS believes that, at the very least, human reviews of AI decisions should have legal protection.

“Protection of human review of fully automated decisions is currently in a piece of legislation dealing with personal data. If no personal data is involved the protection does not apply, but the decision could still have a life-changing impact on us,” added De Silva.

He went on: “For example, say an algorithm is created deciding whether you should get a vaccine. The data you need to enter into the system is likely to be DOB, ethnicity, and other things, but not name or anything which could identify you as the person.

“Based on the input, the decision could be that you’re not eligible for a vaccine. But any protections in the GDPR would not apply as there is no personal data.”

BCS says it welcomes the fact that the government is consulting carefully before making any decision, and that it supports the consultation and will gather views from across its membership.
