Remove biases with AI recruiting

By Marcus Beaver, UKI Country Leader at Alight Solutions
Marcus Beaver, UKI Country Leader at Alight Solutions, discusses the role of AI in recruitment and how human biases affect the hiring process

Recruiting new talent can be a draining, long-winded process for companies, especially in today's digital world. Employers need to attract talent, assess candidates, communicate with them, run background checks, negotiate job offers and manage onboarding, and then retain, nurture and build loyalty in the new hire. All of these steps are time-consuming and costly for a company, especially when the person hired ends up leaving not long afterwards.

Artificial Intelligence (AI) tools can be a great way to minimise the time and energy spent on recruitment. From faster data processing to the removal of unconscious biases, AI tools have clear advantages, but also drawbacks. So how can these systems help an organisation hire better and faster without compromising the quality of talent recruited?

AI in recruitment

AI-driven recruiting software reduces costs by processing large volumes of data quickly. AI also allows for more engaging, user-friendly interactions between hirers and candidates, automates high-volume administrative tasks, elevates the employer brand and supports diversity, inclusion and equality practices. A global survey conducted by Korn Ferry found that 63% of talent acquisition professionals reported an improvement in hiring practices since implementing AI tools.

The biggest advantage of recruiting with AI is the time saved. Talent acquisition teams typically have 30 – 40 open job requisitions at any one time, according to the Society for Human Resource Management. Most of these receive upwards of 250 resumes, with each requisition taking recruiters an average of 23 hours to sift through. All these hours could instead be spent improving HR processes, strategising ways to improve the systems in place within the company, and planning and conducting more thorough evaluations at the interview stage.
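The scale of that time sink follows directly from the figures above. A quick back-of-the-envelope calculation (the function and its inputs are illustrative, using the article's averages rather than data from any real system):

```python
# Rough arithmetic for the figures above: a team carrying 30-40 open
# requisitions, each averaging 23 hours of resume screening.

def screening_hours(open_requisitions, hours_per_requisition=23):
    """Total recruiter hours spent screening across all open roles."""
    return open_requisitions * hours_per_requisition

low = screening_hours(30)   # 690 hours
high = screening_hours(40)  # 920 hours
print(f"Estimated screening load: {low}-{high} hours")
```

That is roughly 17 to 23 working weeks of a single recruiter's time spent on screening alone, which is the workload AI-driven tools aim to absorb.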

There are concerns that some AI tools, such as facial recognition, could infringe privacy rights and go beyond what is needed to appoint candidates. Other concerns include a lack of human interaction and judgement, and biases creeping in at every stage from programming to managing algorithms. These partialities, however, have a human origin and shouldn't necessarily be blamed on the software: the prejudice lies with the people who build the algorithms, not the algorithms themselves.

Human biases hurting recruitment processes

According to the Cognitive Bias Codex, humans are subject to more than 180 cognitive biases, many of which influence traditional hiring practices, from reviewing an applicant's information to selecting interviewees, conducting interviews and choosing a candidate for the position. With so many unconscious biases actively influencing our decisions daily, it's important to consider solutions that might help us make better and fairer decisions.

Although AI is helpful in speeding up the hiring process, it's not without faults if implemented incorrectly. Bias can creep into AI when the training data set is not representative or the relevant characteristics haven't been agreed upon. When AI is created, humans have to code it, and it's at this point in the process that human biases can be built into the algorithm. Those biases also show up as patterns in legacy recruitment data: the longer the talent acquisition history, the more an AI-powered process can learn from it, allowing skewed preferences to be identified and the system to be 're-educated'.
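One common way to surface such patterns in legacy data before training on it is a selection-rate comparison between groups. The sketch below is a minimal illustration, assuming hiring records reduced to (group, hired) pairs; it applies the "four-fifths" heuristic, which flags any group whose selection rate falls below 80% of the most-selected group's rate:

```python
# Minimal sketch: screen legacy hiring records for selection-rate
# disparities before they are baked into a model. Records are
# (group_label, was_hired) pairs; groups and numbers are invented.

def selection_rates(records):
    """Per-group hire rate: hires / applicants."""
    totals, hires = {}, {}
    for group, hired in records:
        totals[group] = totals.get(group, 0) + 1
        hires[group] = hires.get(group, 0) + int(hired)
    return {g: hires[g] / totals[g] for g in totals}

def four_fifths_check(records, threshold=0.8):
    """True if a group's rate is at least `threshold` of the top rate."""
    rates = selection_rates(records)
    top = max(rates.values())
    return {g: rate / top >= threshold for g, rate in rates.items()}

# Illustrative legacy data: group A hired at 30%, group B at 15%.
legacy = [("A", True)] * 30 + [("A", False)] * 70 \
       + [("B", True)] * 15 + [("B", False)] * 85
print(four_fifths_check(legacy))  # {'A': True, 'B': False}
```

Here group B's rate (0.15) is only half of group A's (0.30), well under the 0.8 threshold, so the legacy data itself would be flagged as a biased signal to learn from.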

There are also solutions to circumvent these issues. Running two algorithms in parallel, for example, can mitigate bias. The first algorithm selects candidates based purely on the skills the role requires. The second monitors underlying sensitive attributes, such as gender, age, race and academic qualifications, to check the first for unfair patterns. Another useful technique is post-processing, which takes inherently biased results and recasts them fairly and accurately. For example, if the gender distribution of highly ranked candidates is unequal, one gender might be ranked higher unfairly; in that case, the system readjusts the ranking so each applicant has the same chance to be reviewed and considered.
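The post-processing idea can be sketched in a few lines. The example below is a simplified, hypothetical implementation (the candidates, groups and scores are invented, and real systems use more sophisticated re-ranking): it takes a score-ranked shortlist and interleaves candidates from each group, so no single group monopolises the top of the list that recruiters actually read.

```python
# Hypothetical sketch of the post-processing step described above:
# re-rank a score-ordered shortlist by interleaving groups, so every
# applicant gets a comparable chance of being reviewed.

from itertools import zip_longest

def fair_interleave(candidates):
    """candidates: list of (name, group, score), higher score = better.
    Returns a re-ranked list alternating between groups, best-first
    within each group."""
    by_group = {}
    for cand in sorted(candidates, key=lambda c: -c[2]):
        by_group.setdefault(cand[1], []).append(cand)
    merged = []
    for tier in zip_longest(*by_group.values()):
        merged.extend(c for c in tier if c is not None)
    return merged

# Illustrative shortlist: group M dominates the top scores.
ranked = [("Ana", "F", 0.91), ("Ben", "M", 0.90), ("Cal", "M", 0.88),
          ("Dee", "F", 0.70), ("Eli", "M", 0.87)]
for name, group, score in fair_interleave(ranked):
    print(name, group, score)
```

In this toy run, Dee moves from last place to third: instead of three consecutive group-M candidates occupying the top of the list, reviewers see the strongest candidates from each group in alternation.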

AI for success

As with any business transformation project, there is no one-size-fits-all solution for automating the end-to-end talent acquisition and management process. The success of a project depends on its planning and design, so expert counsel is essential before making large-scale changes, especially when integrating human and machine capabilities.

All parties must work together to build an AI solution for talent. Individual methods must be prioritised so that common ground can be identified and the core system planned and built. This could take the form of staged digitisation or the adoption of a full AI solution. Budget is important, but so is deciding on a model that works for the company and the type of talent it requires.

The AI solution's technical underpinning should be transparent, well-explained, and extensible to different enterprise systems. An AI system is a good solution for making the recruiting process more efficient, as long as all conditions are met.
