Progress toward closing the gender pay gap, globally, is painfully slow. This isn’t for lack of trying: Many companies and governments have gender equity high on their agendas and are making concrete efforts to close the gap, launching pay reviews and implementing policies. It’s not working all that well, though.
For example, since 2017, the UK has required companies with more than 250 employees to report their gender pay gap, a mandate meant to deter wage disparity between genders. But as the second year of reporting rolls around, we’ve already learned that several companies—including parts of HSBC and Virgin Atlantic—have gone backward.
But what if what’s really backward is the way we’re going about solving the problem of gender pay gaps?
Katica Roy is the CEO of Pipeline, a Denver-based startup that makes software to help companies track their gender-equity metrics. She has thought a lot about how to solve gender gaps in the workplace—and pay, she says, is merely the symptom of a problem that starts much, much earlier than in the room where annual salary decisions are made. “The disease is the value that you place on your talent upstream,” when deciding who has potential, who gets promoted, and how each employee’s value is determined, she says. These things are influenced by deep biases which might be ingrained in company culture, or in society itself, and which can’t always easily be identified. “Starting with pay is sort of the afterthought,” Roy says.
The tool Roy has built integrates with companies’ HR platforms, working in the background to search internal databases of information about employees who could be suited for other roles or be ready for promotion. If an internal hire were to be made, for example, the system would assign each candidate a score based on their skills and experience, the text in their performance reviews, and the makeup of the team they’d be joining.
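The scoring step described above can be sketched in rough form. To be clear, this is a hypothetical illustration assuming a simple weighted blend of signals: the feature names, the weights, and the `score_candidate` function are all invented here, not Pipeline’s actual (proprietary) model.

```python
# Hypothetical sketch of a candidate-scoring step like the one described
# above. The features and weights are illustrative only, not Pipeline's
# actual model.
from dataclasses import dataclass

@dataclass
class Candidate:
    skills: set[str]          # skills listed in the HR system
    years_experience: float
    review_sentiment: float   # a 0..1 signal extracted from review text

def score_candidate(c: Candidate, required_skills: set[str]) -> float:
    """Blend skills match, experience, and review signal into one score."""
    skill_match = len(c.skills & required_skills) / max(len(required_skills), 1)
    experience = min(c.years_experience / 10.0, 1.0)  # cap credit at 10 years
    # Weighted blend; the weights are made up for illustration.
    return round(0.5 * skill_match + 0.3 * experience + 0.2 * c.review_sentiment, 3)

alice = Candidate({"python", "sql", "leadership"}, 6.0, 0.8)
print(score_candidate(alice, {"python", "sql"}))  # prints 0.84 under these invented weights
```

A real system would add many more signals (and, as the article notes, checks against the score itself encoding bias), but the basic shape—normalize each signal, then combine—is a common pattern.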
Pipeline uses the tools of artificial intelligence to help companies close their equity gaps; Roy says there are checks and balances for the algorithms to guard against “adverse selection,” where candidates are preferred simply because of their gender. The tool is designed, she says, to tackle unconscious bias without introducing bias of its own.
Roy is by no means the first person to note that the problem of getting more women into top jobs is a pipeline one. In recent years, there’s been a global awakening to the need to nurture female employees in male-dominated industries—to make sure they don’t drop away when caring for children, to promote them on a par with men, and to encourage them back into roles after career breaks. And even before careers begin, the shortage of women studying in fields like engineering and math has led to calls and programs for encouraging study of subjects that will better equip girls for future jobs. But Roy’s perspective is interesting because, as an entrepreneur who has worked on solving the problem, she has first-hand experience of what works, and what doesn’t.
Like any startup, she says, Pipeline tried different things, including focusing on pay first with an early customer. It didn’t work. “You were having a quantitative value discussion when really you’d already made that value decision,” she says.
A raft of companies is proposing tech solutions, many of them using artificial intelligence. Plum promises to use surveys and the tools of organizational psychology, as well as AI, to help companies make the right hires. Pymetrics, another startup, combines tools based on neuroscience with machine learning, and says it tests its algorithms on a dataset of 50,000 previous candidate profiles to make sure the system itself doesn’t have bias—a problem that has hampered efforts by companies like Amazon to build intelligent hiring tools.
Unconscious-bias training for humans is a useful piece of the puzzle, but Roy (who, of course, has a stake in the case for a tech solution) says it’s not enough. Natural language processing, one of the tools of AI, is a good example. Roy’s team put computers to work on the text of employees’ performance reviews (she didn’t want to give specific numbers but said they had done this on “over a thousand” reviews), and found that, given identical text, a male employee would receive a higher numerical score than a female employee. That’s the sort of bias that’s very hard to see, even for a human on the lookout for unfair treatment.
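The kind of finding Roy describes, the same review text yielding different scores depending on gender, is often checked with a swap test: score identical text under each gender and compare. In the sketch below, `score_review` is an invented, deliberately biased toy function standing in for a real NLP model, so the harness has something to catch; only the comparison logic is the point.

```python
# Minimal sketch of a gender-swap audit. `score_review` is a toy stand-in
# for an NLP review-scoring model; it deliberately leaks gender into the
# score to simulate the bias described in the article.

def score_review(text: str, gender: str) -> float:
    """Toy scorer: a crude text signal plus a hidden gender bump."""
    base = min(text.lower().count("excellent") * 2.0 + 5.0, 10.0)
    return base + (0.5 if gender == "male" else 0.0)  # the hidden bias

def gender_swap_gap(text: str) -> float:
    """Score identical text as male and as female; a nonzero gap means bias."""
    return score_review(text, "male") - score_review(text, "female")

review = "Excellent communicator; delivered the project ahead of schedule."
print(gender_swap_gap(review))  # prints 0.5 here; a fair model would print 0.0
```

In practice auditors run this over many reviews and also swap gendered names and pronouns inside the text itself, since a model can pick up gender from the words alone.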
“It’s great that we have unconscious bias training, it’s great that we have programs, but if that was going to solve the problem, it would already have solved the problem,” she says. She points to Google, which spent $150 million on bias training in 2015 alone, and still has systemic problems, evidenced by the Google walkouts of November 2018.
For Roy, all the decisions we make along the way count. “The pipeline leaks very early. It doesn’t just leak at the CEO and boardroom level. It leaks 25 years before that ever happens,” she says. Maybe AI-driven tech solutions are only one tool, but with a problem that runs this deep, it’s looking increasingly like we need the whole arsenal.