The firms that sell AI tools to automate recruiting have started to work the pandemic into their pitches to prospective clients: As the economy tanks and the hiring process moves almost entirely online, AI recruiting tools offer a chance to save some money and make use of new troves of digital data on prospective candidates.
In fact, the field is expected to expand during the crisis and has been attracting new investment. It’s not just automated resume-sifting: There are firms competing to automate every stage of the hiring process. And while the machines seldom make hiring decisions on their own, critics say their use can perpetuate discrimination and inequality.
Writing job descriptions
AI firm Textio claims it can optimize every word of a job posting, using a machine learning model that correlates certain turns of phrase with better hiring outcomes. Companies hiring in California, for example, are advised to describe things as “awesome” to appeal to local job seekers, while New York employers are counseled to avoid the adjective.
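Textio's model is proprietary, but the basic idea it describes — correlating individual words in a posting with a downstream hiring outcome — can be sketched in a few lines. The postings, outcomes, and words below are invented for illustration; a real system would use far more data and a proper statistical model rather than raw per-word success rates.

```python
from collections import defaultdict

# Toy job postings paired with a binary outcome, e.g. "filled quickly".
# All data here is invented for illustration.
postings = [
    ("awesome team building awesome products", 1),
    ("awesome role with great benefits", 1),
    ("synergy driven rockstar wanted", 0),
    ("great benefits and flexible hours", 1),
    ("rockstar ninja developer needed", 0),
]

def word_scores(postings):
    """Score each word by the success rate of postings that contain it."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for text, outcome in postings:
        for word in set(text.split()):
            totals[word] += 1
            hits[word] += outcome
    return {w: hits[w] / totals[w] for w in totals}

scores = word_scores(postings)
```

On this toy data, "awesome" scores 1.0 and "rockstar" scores 0.0 — which is the sense in which such a tool can recommend one adjective and advise against another, without any notion of *why* the correlation holds.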
Big name firms like LinkedIn and ZipRecruiter use matchmaking algorithms to comb through hundreds of millions of job postings to connect candidates with compatible companies. Smaller competitors, like GoArya, seek to differentiate themselves by scraping data from the internet—including social media profiles—to inform recruiting decisions.
Firms like Mya promise to automate the task of reaching out to candidates via email, text, WhatsApp, or Facebook Messenger, using natural language processing to have “open-ended, natural, and dynamic conversations.” The company’s chatbots even conduct basic screening interviews, filtering out early-stage applicants who don’t meet the employer’s qualifications. Other companies, like XOR and Paradox, sell chatbots designed to schedule interviews and field applicants’ questions.
Some AI vendors—including Ideal, CVViZ, Skillate, and SniperAI—promise to cut the drudgery of hiring by automatically comparing applicants’ resumes with those of current employees. Tools like these have faced criticism for recreating existing inequalities: Even if the algorithms are programmed to ignore traits like race or gender, they might learn from past hiring data to pick up on proxies for these traits—for example, prioritizing candidates who played lacrosse or are named Jared. Amazon developed its own screener and quickly scrapped it in 2018 after finding it was biased against women.
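The proxy problem the critics describe can be made concrete with a toy similarity-based screener. The sketch below uses Jaccard similarity between token sets — a deliberately simple stand-in for whatever these vendors actually use — and invented resumes. Note that no protected attribute appears anywhere, yet a proxy term ("lacrosse") that happens to correlate with past hires still changes the ranking of two otherwise identical applicants.

```python
def jaccard(a, b):
    """Jaccard similarity between the token sets of two strings."""
    a, b = set(a.split()), set(b.split())
    return len(a & b) / len(a | b)

# Resumes of current employees (invented). "lacrosse" is a proxy
# trait shared by the historically hired group.
employees = [
    "python engineer lacrosse captain",
    "java engineer lacrosse team",
]

def screen(resume):
    """Score an applicant by maximum similarity to any current employee."""
    return max(jaccard(resume, e) for e in employees)

# Two applicants with identical qualifications; only the hobby differs.
score_chess = screen("python engineer chess club")
score_lacrosse = screen("python engineer lacrosse captain")
```

The lacrosse player outscores the chess player even though neither resume mentions gender, race, or class — the screener has simply learned to reward resemblance to the people already hired.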
Recruiting firm HireVue, which boasts 700 corporate clients including Hilton and Goldman Sachs, sells an AI tool that analyzes interviewees’ facial movements, word choice, and speaking voices to assign them an “employability” score. The platform is so ubiquitous in industries like finance and hospitality that some colleges have taken to coaching interviewees on how to speak and move to appeal to the platform’s algorithms.
Peering into prospective employees’ souls
AI firm Humantic offers to “understand every individual without spending your time or theirs” by using AI to create psychological profiles of applicants based on the words they use in resumes, cover letters, LinkedIn profiles, and any other piece of text they submit.
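Humantic does not disclose its method. A much simpler technique from the personality-computing literature — closed-vocabulary scoring, where text is counted against hand-built trait word lists — gives a sense of what profiling from resume text can look like. The mini-lexicons below are invented; real tools use validated word lists, not these.

```python
import re

# Invented mini-lexicons for illustration only; real systems use
# validated vocabularies (e.g. LIWC-style categories).
LEXICON = {
    "extraversion": {"team", "people", "events", "led"},
    "conscientiousness": {"detail", "organized", "deadline", "thorough"},
}

def profile(text):
    """Count lexicon hits per trait in a piece of applicant text."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {trait: len(words & vocab) for trait, vocab in LEXICON.items()}

p = profile("Organized, detail oriented engineer who led team events")
```

Even this crude version illustrates the critics' worry: the "profile" is just word overlap with a fixed list, and the applicant has no idea which words are being counted against which trait.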
Meanwhile, Pymetrics puts current and prospective employees through a series of 12 games to glean data about their personalities. Its algorithms use the data to find applicants who fit a company's culture. In a 2017 presentation, a Pymetrics representative demonstrated a game that required users to react when a red circle appeared but to do nothing when a green circle appeared. “That game was actually looking at your levels of impulsivity, it was looking at your attention span, and it was looking at how you learn from your mistakes,” she told the crowd. Critics suggest the games might just measure which candidates are good at puzzles.
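Pymetrics does not publish its scoring, but the red/green circle task the representative describes is a classic go/no-go test, and the standard way to score one is straightforward. The trial data below is invented; reacting on a "do nothing" trial is a commission error, the usual proxy for impulsivity, while missing a "react" trial is an omission error, a proxy for lapses in attention.

```python
# Each trial: (stimulus, reacted?). Per the demo, users should react
# to red circles and do nothing for green ones. Data is invented.
trials = [
    ("red", True), ("green", False), ("red", True),
    ("green", True),   # commission error: reacted on a no-go trial
    ("red", False),    # omission error: missed a go trial
    ("green", False),
]

def impulsivity(trials):
    """Fraction of no-go ('green') trials where the user reacted."""
    nogo = [reacted for stim, reacted in trials if stim == "green"]
    return sum(nogo) / len(nogo)

def inattention(trials):
    """Fraction of go ('red') trials where the user failed to react."""
    go = [reacted for stim, reacted in trials if stim == "red"]
    return 1 - sum(go) / len(go)
```

Whether two numbers like these say anything about job performance — rather than about reflexes and puzzle practice, as the critics suggest — is exactly the point of contention.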