The fight for $15 has become the rallying cry of low-wage workers in America. It is nearly impossible to support yourself on a low-paying job. The solution appeared simple: require employers to pay people more. In 2014 the city of Seattle passed an ordinance to increase the minimum wage from $9.47 to $15 over two to four years (depending on company size and whether the employer offers benefits).
Traditional economics suggested this would be a bad idea: when something costs more, people buy less of it, and labor is no exception. But an influential study in the 1990s questioned that relationship. The economists David Card and Alan Krueger examined a small increase in New Jersey's minimum wage, from $4.25 to $5.05, and found it did not appear to have a large impact on employment at fast-food restaurants. The study spawned a contentious economic literature. It also led politicians, labor advocates, and pundits to conclude that if small wage increases had negligible employment effects, big increases would work, too.
At first glance the experiment appeared to be a resounding success. Employment in the Seattle area increased, and pundits declared Economics 101 reasoning dead and buried. Then a group of economists took a closer look at the data in a paper published this week. They argue that total employment numbers reveal little, because low-wage workers make up too small a share of the workforce for effects on them to show up in aggregate data, especially in Seattle, where an economic boom has increased demand for skilled workers. Even restaurant-industry employment is not telling, because restaurants employ many workers who earn more than the minimum wage.
To isolate the workers most likely to be affected, the economists looked at employment and hours among those earning less than $19 an hour. They estimate that the first increase, from $9.47 to $11 an hour in 2015, had a small impact on employment and hours. But the increase to $13 a year later had a more dramatic effect. They estimate it resulted in low-wage employees working 3.5 million fewer hours per quarter and in 5,000 fewer jobs. After accounting for the lost hours and jobs, they argue, low-wage employees ended up being paid about $120 million less a year by single-location Seattle businesses, a loss of roughly $125 a month per low-wage worker.
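The two figures are consistent with simple arithmetic. If the monthly loss is just the annual payroll reduction spread evenly over twelve months and across the affected jobs (an assumption; the paper's exact divisor is not given here), the numbers imply a low-wage workforce of roughly

\[
\frac{\$120{,}000{,}000 \text{ per year}}{12 \times \$125 \text{ per month per job}} \approx 80{,}000 \text{ jobs.}
\]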
The authors speculate that the work once done by entry-level, low-paid workers is now done by more experienced, higher-paid employees. This suggests the higher wage caused the most harm to workers with the least experience and skill. The damage may also be long-lasting: most minimum-wage workers get raises and earn more over time, but only if they can get hired in the first place, and that first rung on the ladder is getting harder to reach.
The results suggest that increasing the minimum wage to large, arbitrary round numbers is not costless. But the problem remains that low-wage workers in the US are struggling to get by. A better solution is the federal earned income tax credit, which boosts take-home pay with a taxpayer-financed subsidy for low-wage work. Some argue that it subsidizes employers who underpay workers. But the Seattle study's results suggest that small employers cannot, or will not, pay low-skill workers enough to survive. The tax credit can make up the difference while keeping people employed. It also targets the neediest: low earners supporting families, rather than teenagers living with their parents.
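To see how the subsidy works in practice: the credit phases in as a percentage of earnings up to a cap, plateaus, then phases out at higher incomes. Using an illustrative 40% phase-in rate (the actual rate and cap depend on family size and tax year), a parent earning $10,000 from low-wage work would take home roughly

\[
\$10{,}000 + 0.40 \times \$10{,}000 = \$14{,}000,
\]

a meaningful boost that arrives without adding a cent to the employer's wage bill.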
A high minimum wage may sound like a simple way to provide a living wage. But the latest data suggests the costs are ultimately borne by the most needy.