AI Is Pulling Biases Into Talent Recruitment, but Solutions Are Available

Current methods using artificial intelligence undermine efforts to diversify the workforce

Digital tools aren't going to create new ways of doing things; they're going to learn from what already exists. Photo Illustration: Amber McAden; Source: Getty Images

This year’s trends in HR and recruitment show many exciting developments that aim to redesign job hunting and employee selection.

AI is increasingly used to refine recruitment processes with machine efficiency and big data. Employee referrals, perhaps the opposite of AI automation, are another major trend. According to PayScale, more than a third of workers in the U.S. received a referral for their current job. Another study shows referral bonus amounts rising as companies seek to incentivize internal recommendations over external recruitment agencies.

Technology companies are using machine learning to power job search, with keywords and job history used to auto-match candidates to vacancies; to conduct interviews via chatbots and facial/vocal analysis software; and even to perform talent rediscovery, where previously unsuccessful candidates are connected to better-suited openings.


However, this nascent technology is already producing results that suggest AI may be perpetuating the biases behind the poor diversity levels seen throughout the tech industry. A well-known example comes from Amazon, which in 2014 began building a recruitment engine to assess resumes and identify the best candidates. The company soon discovered, however, that the algorithm, trained on historical hiring data, reproduced strong gender and ethnic biases: based on the data it had been fed, the machine taught itself that white men were the most suitable candidates. The program was finally scrapped in 2018.

We may lose sight of the fact that algorithms rarely invent new ways of thinking and making decisions, but instead inherit and execute taught biases with alarming efficiency. As The Wall Street Journal described it, “Although tech companies have stepped up efforts to recruit women and minorities, computer and software professionals who write AI programs are still largely white and male. … Thus, when engineers test algorithms on these databases with high numbers of people like themselves, they may work fine.”

In a similar way, internal referral recruitment strategies may inadvertently become exclusionary. The PayScale report states that 35 percent of workers in the technology industry were employed as a result of a referral. Out of these, 41 percent were referred by a family member or close friend. It’s easy to see how this could result in tech companies simply hiring people who come from the same demographics and backgrounds as their existing staff, thwarting any possibility of increased diversity.

Adjusting referral programs will be a difficult topic for HR teams. According to a range of studies, referral programs outperform other recruitment strategies on every metric (cost per hire, time to hire, retention, etc.). Unfortunately, while referral programs excel on traditional recruitment metrics, they reflect the same biases against diversity prevalent throughout our industry. According to PayScale, white people represented 70 percent of those referred, while black women were the least represented at 13 percent.

It’s clear, then, that we may be widely embracing recruitment methods that undermine any commitment to diversity. AI alone cannot be relied on to deliver fair and representative hiring. Instead, technology can be harnessed to build better recruitment processes, such as software for blind psychometric assessments, which can remove much of the preliminary bias that comes with resumes or interviews. It can also leverage role-specific performance data to identify the personal attributes that matter most for a role, saving time spent searching through resumes and applications. Given the right data and strategy, AI can be a great help in redesigning recruitment to better serve diversity goals.

There are simple methods that could be put in place here: adjusting bonus structures to incentivize referrals of candidates from underrepresented groups would be a good start, for instance. Furthermore, HR teams can look to other talent sourcing pools without incurring large recruitment fees. If referral strategies tend to yield white men, what other channels can be used to attract applications from women and people of color? These are all starting points.

Relying on pure human inclination doesn’t solve our diversity concerns, either. The inherent exclusion that comes with referral policies has to be acknowledged and steps need to be put in place to overcome it.


James Oyedele is global solutions director of programmatic at Adform.