The Minneapolis public school district, one of the largest school districts in Minnesota, hires hundreds of teachers a year across roughly 70 schools. The teaching staff of 2,800 is largely white (86%), but it serves a diverse student body. Some 70% of the district’s 30,000 students are non-white, 21% are English-language learners, and 65% are eligible for subsidized lunch.
It’s the kind of American school district (i.e. mostly poor and non-white) known for having higher-than-average attrition rates for teachers, and especially teachers of color. On both counts, the Minneapolis district is no exception. In 2018, it lost 10% of its white teachers and 12% of its teachers of color. That’s versus a national turnover rate last measured at roughly 8%, according to a Learning Policy Institute analysis (pdf) of federal figures. The result for Minneapolis (pdf) is a faculty that is, on average, less experienced than that of richer districts in Minnesota, state figures show, and less diverse than intended.
Teacher turnover has been connected in multiple studies to lower student achievement. It’s financially burdensome, costing upwards of $20,000 per teacher replacement in urban school districts, says the nonpartisan, nonprofit Learning Policy Institute. And it’s a bigger problem in the US, the institute says, than in places like Finland, Singapore, and Ontario, Canada, which are known for their high-achieving school systems and have attrition rates closer to 3% or 4%.
One obvious, if complex, way of addressing turnover is to increase retention of existing staff. But what if school districts could head off some of the most probable attrition scenarios at the outset, simply by making smarter hiring decisions? What if there were a way to predict which candidates for teaching jobs would be more likely to succeed in the role, and less likely to quit, based on cues in their resumes and work histories?
Those were the questions that a team of academics from the University of Minnesota and the University of British Columbia recently sought to answer for the Minneapolis district. Synthesizing information from more than 16,000 external applications collected by the district between 2007 and 2013, the researchers were able to look at the work history of every applicant—and see how those who got hired worked out.
Their findings, published in October in the Journal of Applied Psychology, suggest that using machine learning to screen teachers’ job applications could improve the quality of hired teachers and reduce turnover risk, while increasing diversity by shrinking the chance for bias during the hiring process.
Roughly 10% of the annual demand for teachers in the US can be chalked up to growth in schools and school districts, the Learning Policy Institute says. The vast majority of hiring is to replace teachers who’ve quit.
Some teacher turnover can’t be avoided; about a third of the annual departures nationally are due to retirement. But the balance comes from early- and mid-career teachers exiting the profession.
After helping to write a paper about performance pay for teachers in Minnesota in 2014, Aaron Sojourner, a labor economist at the University of Minnesota’s Carlson School of Management, started talking with an administrator at Minneapolis Public Schools about their shared interest in improving school districts, with a focus on teacher talent.
They wondered whether hiring managers could go a step beyond their usual assessments of resumes and online applications to predict performance and turnover risk.
Like a lot of employers, school districts are increasingly turning to recruiting software to help ease—and speed up—the hiring process. But even then, managers may overlook telling details of an applicant’s information, and allow their biases to creep in.
Sojourner and his fellow researchers hypothesized that both these shortcomings could be solved with the help of machine learning, a branch of artificial intelligence and method of data analysis to identify patterns and make decisions with minimal human intervention.
Backed by a $400,000 grant from the US Department of Education, the researchers dug into seven years’ worth of hiring data from Minneapolis Public Schools. Using a combination of machine learning and economic and psychological theory, they developed a predictive model of the effectiveness and retention probabilities of the applicants. And because more than 2,200 of the roughly 16,000 applicants had already been hired, the researchers’ predictions could be tested against several years’ worth of outcomes in terms of both teacher performance and tenure.
The main predictors the study examined included work history (with knowledge, skills, and tasks turning out to be far more relevant than past titles); the length of tenure in previous jobs; and the reasons for leaving previous jobs (i.e. layoffs, a desire to leave a bad job, or an opportunity to take a better job).
When it comes to length of tenure, past behavior is sometimes the best predictor of future behavior. Overall, teachers with relatively short tenures in previous jobs tended to be less effective and leave more quickly.
But the probabilities were nuanced. Applicants who had left a previous job for a better one tended to be better performers, for example. Departures that were framed negatively (i.e. due to bad management, dissatisfaction with colleagues, or exhaustion) tended to be indicators of worse performance. And those who had left previous jobs involuntarily—due to budget cuts or layoffs, for instance—tended to be lower performers compared with those who left jobs voluntarily.
The researchers were able to use machine learning to identify words or phrases signaling different behavioral motives for leaving a previous job, in some cases recognizing patterns that a hiring manager might miss.
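The study’s actual models aren’t published as code, but the underlying idea—training a classifier to sort free-text departure reasons into behavioral categories—can be sketched with a bare-bones Naive Bayes text classifier. The labeled example statements and category names below are hypothetical stand-ins, not the researchers’ data:

```python
from collections import Counter, defaultdict
import math

# Hypothetical labeled examples; the study's real training data came
# from 16,000+ district applications and is not reproduced here.
TRAIN = [
    ("took a position with more responsibility", "better_job"),
    ("accepted an offer at a larger organization", "better_job"),
    ("left to pursue a new opportunity", "better_job"),
    ("manager was unsupportive and morale was low", "avoid_bad_job"),
    ("burned out and dissatisfied with colleagues", "avoid_bad_job"),
    ("position was eliminated in budget cuts", "involuntary"),
    ("laid off when the program lost funding", "involuntary"),
]

def train_nb(examples):
    """Fit per-class word counts for a simple multinomial Naive Bayes."""
    word_counts = defaultdict(Counter)
    class_counts = Counter()
    vocab = set()
    for text, label in examples:
        words = text.lower().split()
        word_counts[label].update(words)
        class_counts[label] += 1
        vocab.update(words)
    return word_counts, class_counts, vocab

def classify(text, word_counts, class_counts, vocab):
    """Return the most likely departure-reason class for a free-text answer."""
    words = text.lower().split()
    total = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for label in class_counts:
        # Log prior plus log likelihood, with add-one smoothing so
        # unseen words don't zero out a class.
        lp = math.log(class_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in words:
            lp += math.log((word_counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = train_nb(TRAIN)
print(classify("laid off after budget cuts", *model))  # → involuntary
```

A production system would use far richer features and more data, but even this toy version shows the advantage Sajjadiani describes: the model applies the same scoring to every application, where human readers interpret the same phrase differently.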
“This is where machine learning can help us analyze these texts, because each of us may read and interpret these texts differently,” says Sima Sajjadiani, the lead author of the study on the Minneapolis applicants.
Sajjadiani, a University of Minnesota PhD and an assistant professor at the University of British Columbia’s Sauder School of Business, says the district was surprised to learn, for example, that a candidate’s mention of passion in a job application (e.g., “teaching has always been my passion”) can in fact be a meaningful signal of the applicant’s approach to the job. She says hiring officers probably thought that people tend to overuse statements about passion in their cover letters, and, therefore, treated these statements as cliches that didn’t signal anything important. But the analysis suggests that they often do.
Perhaps less surprisingly, the study also found that when applicants had prior work experience that was relevant to the job for which they were applying, they tended to be more effective teachers and stayed longer with the school district. What was surprising was the variety of occupations that could provide people with relevant experience. For example, an applicant who previously worked as a bartender may have acquired some of the same skills or knowledge that teachers use, says Sajjadiani, whose research is focused on topics in human resources.
Using the work skills and job requirements laid out by the Occupational Information Network, a database sponsored by the US Department of Labor, Sajjadiani and her fellow researchers could estimate the similarity between each applicant’s work history and the profile of the job to which they were applying. Previous titles were not as good an indicator of success as specific skills and abilities, even if those skills and abilities were gained in fields other than teaching.
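One common way to operationalize that kind of comparison—not necessarily the researchers’ exact method—is to represent each occupation as a vector of skill-importance scores, as the O*NET database publishes, and measure the cosine similarity between an applicant’s prior occupation and the target job. The occupations, skill names, and scores below are invented for illustration, not actual O*NET data:

```python
import math

# Hypothetical skill-importance vectors; real O*NET profiles rate
# occupations on dozens of standardized skills.
SKILLS = ["instructing", "speaking", "social_perceptiveness",
          "service_orientation", "time_management"]

PROFILES = {
    "elementary_teacher": [90, 85, 80, 70, 75],  # target job profile
    "bartender":          [20, 75, 70, 85, 60],  # shares people-facing skills
    "data_entry_clerk":   [10, 20, 15, 25, 55],  # little skill overlap
}

def cosine_similarity(u, v):
    """Cosine of the angle between two skill-importance vectors (0 to 1 here)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

target = PROFILES["elementary_teacher"]
for job, vec in PROFILES.items():
    if job != "elementary_teacher":
        print(f"{job}: {cosine_similarity(target, vec):.2f}")
```

Comparing skill vectors rather than job titles is what lets a bartender’s people-facing experience register as relevant to teaching, while a title-matching screen would discard it.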
For now, the machine-learning analysis of the applications remains an intellectual exercise, a shadow system running at a remove from how the district works today. If Minneapolis Public Schools were to take the next step and apply the researchers’ methods to real-life hiring, the district would likely run into heavy resistance.
“We already know the factors that lead teachers to quit aren’t found on an application, they are found in underfunded classrooms and in kitchen table discussions about family budgets,” says Denise Specht, the president of Education Minnesota, a statewide union of nearly 90,000 educators. “No algorithm is going to overcome the fact that people leave the classroom due to resource reasons, no matter what the researchers claim about their software.”
She says the researchers’ goal of removing implicit bias from the hiring process is “laudable,” but that it misses the mark. She says teachers of color often lose their jobs because of factors outside their control, like budget cuts or biased decisions made by administrators.
Then there is the widespread concern about the biases that are being baked into algorithms themselves. Last year, Reuters reported that Amazon had been working on a hiring tool that turned out to exhibit gender bias. The computer models were trained to filter through resumes submitted to the company over a 10-year period and identify patterns to predict the best candidates. But because most of the resumes came from men, the algorithm ended up favoring male applicants.
For the Minneapolis teachers’ study, the researchers said they were careful in how they chose their variables, to help reduce the risk of race and gender bias. For instance, when it came to categorizing reasons for past job departures, the model excluded specific reasons such as relocation or caregiving duties, which could be associated with gender and end up penalizing women.
Sojourner says there are “legitimate concerns” about whether machine learning is going to increase bias in unintended ways. For instance, it’s possible that applicants of a particular race or background were terminated from previous jobs due to prejudice and discrimination by their past managers.
But it’s also worth pointing out that machine learning could reduce bias. That’s what appears to have happened in the hiring recommendations produced by the researchers’ model, based on the model’s “adverse impact ratios,” which measure the difference in selection rates for applicants of different groups. For instance, under the model’s recommendations, the ratio of the hiring rate for non-white applicants to the hiring rate for white applicants was 1.03, versus a ratio of 0.93 under the conventional hiring methods used by the school district. (On gender, the adverse impact ratio was 0.99 in both cases, suggesting female candidates would be selected at roughly even rates under both hiring systems.)
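The adverse impact ratio itself is a simple statistic: one group’s selection rate divided by another’s. The applicant counts below are made up to reproduce the article’s reported ratios; the study doesn’t publish the underlying pool sizes:

```python
def adverse_impact_ratio(selected_a, total_a, selected_b, total_b):
    """Selection rate of group A divided by selection rate of group B.
    Values below 1.0 mean group A is selected at a lower rate; the
    EEOC's "four-fifths" rule of thumb flags ratios under 0.80."""
    return (selected_a / total_a) / (selected_b / total_b)

# Hypothetical pools chosen to match the article's reported figures:
# 1.03 for the model's recommendations, 0.93 for conventional hiring.
print(adverse_impact_ratio(103, 500, 100, 500))  # → 1.03
print(adverse_impact_ratio(93, 500, 100, 500))   # → 0.93
```

Because the ratio depends only on rates, not raw counts, it can be compared across hiring processes of different sizes, which is what makes it a standard fairness check for selection systems.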
Were the researchers’ selection methods put into practice, they could have important implications for job applicants who did not have the chance to follow conventional career paths—who tend to belong to minority groups—and provide more opportunities for those applicants, Sajjadiani says.
Algorithms can put a different lens on hiring, and force us to consider previously neglected factors buried deep in applications. But both Sajjadiani and Sojourner say their tool should only be one factor in the hiring process. “The challenge,” says Sojourner, “is to find the best way to support people’s decisions with information from analysis.”
While their study focused squarely on teacher selection, the researchers suggest that the findings could be generalized to other fields, such as medicine, social work, or other service-related jobs similar to teaching, where an applicant’s specific motivation, interest, and individual characteristics are important determinants of success. But the researchers know that if they are to see their methods adopted by employers, they will need to be patient.
“These are very high stakes for the hiring organizations and for the applicants,” Sojourner says. “And whenever you’re making changes to standard operating procedure, it deserves some scrutiny.”