How effective is artificial intelligence in removing racial bias in hiring?

The future of work is colorful.
Image: #WOCinTech Chat on flickr/ CC-BY-4.0

A growing number of tech companies are placing their bets on algorithms to reinvent talent acquisition and create a more inclusive workforce. In some cases, this might mean entirely removing traditional aspects of the hiring process.

Introduced in the 1990s, applicant tracking systems (ATS) were created to help HR professionals manage the surge of applications that accompanied the growing use of the internet. Over the following decades, ATS became increasingly sophisticated, using algorithms to sift through thousands of resumes based on various data points. The promise was efficiency and blind hiring, but in practice these algorithms have often perpetuated structural inequities.

Algorithms work by being fed data and then making choices based on that data. If a company’s data set already skews toward a specific demographic, like “Harvard graduates who worked at Goldman Sachs,” the algorithm will simply perpetuate that pattern. A well-known example: in 2015, Amazon realized its AI recruiting tool preferred male candidates. The algorithm penalized resumes containing the word “women” and was biased against candidates who had attended certain women’s colleges.
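To make the mechanism concrete, here is a minimal, hypothetical sketch (not Amazon’s actual system): a toy screener learns per-word scores from past hiring decisions. Because the historical rejections in this made-up data disproportionately contain one word, that word acquires a negative weight, and otherwise-identical resumes are ranked differently.

```python
from collections import Counter

def train_word_scores(past_hires, past_rejects):
    """Learn per-word scores from historical decisions: words common in
    hired resumes score positive, words common in rejected ones negative."""
    hired = Counter(w for r in past_hires for w in r.split())
    rejected = Counter(w for r in past_rejects for w in r.split())
    vocab = set(hired) | set(rejected)
    return {w: hired[w] - rejected[w] for w in vocab}

def score(resume, weights):
    """Sum the learned weights of the words in a resume."""
    return sum(weights.get(w, 0) for w in resume.split())

# Hypothetical, skewed training data reflecting past biased decisions:
past_hires = ["chess club captain", "rugby captain", "chess captain"]
past_rejects = ["women's chess club captain", "women's rugby captain"]

weights = train_word_scores(past_hires, past_rejects)

# "women's" now carries a negative weight, so the two resumes below,
# identical except for that word, receive different scores:
print(score("chess captain", weights))          # 2
print(score("women's chess captain", weights))  # 0
```

The model never “decides” to discriminate; it simply reproduces the correlations present in its training data, which is exactly the failure mode described above.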

A new crop of tech companies believes that AI can solve the problem. Founded in 2013, Pymetrics describes itself as a “human-centered AI platform with the vision to realize everyone’s potential with fair and accurate talent matching.”

Pymetrics hiring software doesn’t assess a candidate’s skill set based on past performance. Applicants play a round of games based on neuroscience research exercises that reveal behaviors and skills and then match these skills to different jobs. For companies, these matches give access to a larger, diverse pool of qualified candidates that an ATS might have filtered out. For candidates, the matches expand their job search and potentially make them aware of roles they wouldn’t have originally considered.

“The whole idea behind Pymetrics is that instead of using a resume, you are looking at people’s cognitive, social, and emotional aptitudes [through a combination of neuroscience-based tests and machine learning to match job seekers with openings]. It’s also much more future-facing and potential-oriented, rather than backward-facing and sort of only talking about your past experiences. It’s a much more holistic, hopeful view of someone than, Oh, this is what you’ve done, and this is all you can do,” explained Pymetrics CEO Frida Polli in a Quartz at Work interview. Pymetrics says its custom algorithms are rigorously tested for bias.

The concept of blind hiring is also the premise of startup Blendoor, a Tinder-style app for jobs that removes candidates’ gender, race, photos, and names. “[Blendoor] enables companies to hire based on merit, not molds. We are using performance data to train our algorithms rather than historical data sets,” said the company’s founder Stephanie Lampkin in an interview with All Raise.

These companies claim their tools are already working. Pymetrics, which boasts clients such as the Boston Consulting Group, Accenture, LinkedIn, and Unilever, says one of its clients evened out the gender and ethnicity split among applicants. Blendoor says one client doubled the number of women it hired and substantially increased minority hires. There is not yet rigorous experimental evidence on the effects of these programs.

Gamification and blind hiring may well be effective tools. But diversity, equity, and inclusion practitioners stress that they address only the symptoms of a deeper problem and are not a panacea for racial bias in hiring.

“Culture eats strategy for breakfast,” explains Kyanna Wheeler, a racial equity strategist for the City of Seattle. “Even if a company does manage to improve its racial diversity, is its company culture able to sustain this strategy?” Instead, Wheeler says, leaders should work to build an anti-racist company that rethinks the way the company operates, dismantling the invisible cultural structures that drive it. “What you really need is a cultural shift. If your company values John Smith, and keeps hiring John Smiths, you have to unpack why this pattern exists.” This approach will take more work and requires accountability, she says.

The platforms themselves might even perpetuate cultural assumptions about talent. Blind hiring removes identity but still uses past work experience as a metric for future success. “We’ve been socialized to believe certain metrics have equal value. Real change means questioning assumptions,” Wheeler says.

There’s also a danger that AI in hiring becomes too reductive. Gamification cannot convey context and lived experience. “If you’re testing for risk, a poor person might be risk-averse but still innovative. Poor communities are known for innovation; they’ve perfected how to do more with less,” says Kirk Mead, principal at the Carrington Consulting Group, a consultancy specializing in organizational development.

Also, these tools do little to help diverse job seekers find the right companies in the first place. After all, a diverse workforce starts with a diverse pool of candidates to draw from. “Algorithms can’t fix pathology. These solutions present a necessary component to solving racial bias but not a sufficient one,” says Mead. “We still have to address the core question of access and opportunity.”