With the dangers of AI bias and unethical systems emerging more frequently as algorithms are developed, it is essential for businesses to consider the acceptable use of AI, machine learning and GPT models across a wide variety of contexts and settings.
Couchbase’s Head of Developer Experience, Perry Krug, discusses the importance of efficient, yet responsible, developments that steer AI away from bias and towards ethical practice.
Couchbase aims to simplify how developers deploy and consume modern applications worldwide. It prides itself on providing Capella, a fast and flexible cloud database platform that allows organisations to quickly build applications that deliver premium experiences.
What steps can organisations take to prevent AI bias and how can it be solved?
Unfortunately, there is no easy way to eliminate AI bias, given how prevalent biases already are in human judgement. However, organisations can work to reduce bias in datasets by adopting a holistic view.
The most important first step for businesses is to mandate the removal of biases from datasets before AI models are put into production. High-quality, unbiased datasets must be prioritised above all else. In practice, this could mean removing protected information such as gender, age, nationality, and country of origin from datasets, preventing any correlation with other personal details that could result in certain demographics being excluded.
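As a minimal sketch of this step, the snippet below strips protected attributes from records before they reach a training pipeline. The field names and records are hypothetical illustrations, not Couchbase APIs or a complete de-biasing process (removing columns alone does not catch proxy correlations):

```python
# Hypothetical field names for illustration; a real deployment would
# define its protected attributes per applicable law and policy.
PROTECTED_FIELDS = {"gender", "age", "nationality", "country_of_origin"}

def strip_protected(record: dict) -> dict:
    """Return a copy of the record with protected attributes removed."""
    return {k: v for k, v in record.items() if k not in PROTECTED_FIELDS}

records = [
    {"income": 52000, "tenure_years": 3, "gender": "F", "age": 41},
    {"income": 67000, "tenure_years": 7, "nationality": "UK"},
]

cleaned = [strip_protected(r) for r in records]
# cleaned[0] → {"income": 52000, "tenure_years": 3}
```

A filter like this would typically run at ingestion time, before any model training, so that protected attributes never enter the feature set in the first place.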
Why is it so important to tackle AI bias now?
Forrester predicts that spending on AI software will accelerate from US$33bn in 2021 to US$64bn in 2025 – that’s double the rate of the overall software market. And across businesses and society, the technology is already being used to manage workloads, support instant customer service requests, and provide detailed answers to prompts.
Clearly, as AI use cases and popularity grow, the impact of AI bias becomes a much bigger issue.
What are the implications of not properly dealing with AI bias?
A recent survey of British and American IT leaders found that 36 percent of businesses have been negatively impacted by AI bias, with customers lost and revenue affected. Over half of those surveyed also said that a loss of customer trust is the main business risk related to AI bias.
Examples suggest these concerns are valid. A recent study found that AI models used to identify the risk of liver disease from blood tests are twice as likely to miss disease in women as in men. And research by the University of California at Berkeley found that AI models designed to help allocate patient care gave black patients lower risk scores than white patients – despite comorbid conditions being statistically more likely among black patients.
Applications built on biased datasets can produce inaccurate, sub-par or even harmful AI-powered results, with severe consequences for consumers and businesses alike.
How can organisations’ data scientists ensure ethical AI practices?
Data scientists have an important role to play, acting as the custodians of high-quality data and ethical practices. To tackle the specific challenge of the underrepresentation of certain groups of people, the best solution is transparency.
By making data available to as many data scientists as possible, a more diverse group of people can sample it and identify inherent biases. Drawing on these experiences, AI models can be built that 'train the trainer', automating the inspection function and making it possible to vet large volumes of data.