When the OECD tested half a million 15-year-old students around the world in 2012, in an exam known as PISA, US teens came in 27th in math, below their counterparts in Estonia, Latvia, Vietnam, and Spain.
American adults, it turns out, are no more capable. And when it comes to digital problem-solving, they are literally the worst in the developed world.
The Program for International Assessment of Adult Competencies (PIAAC) is a test given to around 150,000 16- to 65-year-olds in 24 developed countries around the world. It is meant to gauge the skills adults need to function in a knowledge-based economy, both at work and in life, and tests three areas: literacy, numeracy, and digital problem-solving.
Questions include things like (pdf) “which of these candidates received the least amount of votes,” with a chart listing four candidates and the number of votes they received (the most basic level), and “sort these emails RSVPing to a party into two pre-existing files, people who can come and those who cannot.”
The results are reported two ways: first, as averages on a scale of 0-500 and second, as proficiency, or what share of the test takers fall into various ability groups.
In literacy, US adults do okay: the average score across all 24 countries was 273, and the average US adult clocked in at 272, good for 13th place. In numeracy, things were bleaker: US adults scored 257, significantly below the average of 269, putting them behind Cyprus, Poland, Estonia, and the Slovak Republic for an 18th-place finish.
In math, Americans with a high school diploma performed about the same as high school dropouts in other countries.
“What makes this so sobering is that this is not a high-level test of math or critical-thinking skills,” Stephen Provasnik, a research scientist at the National Center for Education Statistics, told Quartz.
In digital problem-solving, US adults came dead last, with a score of 274 compared with an average of 283.
No silver linings
What’s worse, the US has a larger share of low performers in every single area.
Nearly 70% of young adults in the US (those aged 16 to 34) either did not finish high school or have only a high-school diploma. The average across the other countries is 73%, yet a larger share of this group in the US scored at the bottom levels of proficiency in reading, math, and digital problem-solving than their international peers.
Proficiency is reported by breaking performance down into six levels. Countries want lots of people performing at the high levels and few testing at the bottom ones. For the US, that is not the case. In literacy, a larger percentage of young Americans perform at the top, which is good, but a larger percentage also land at the bottom, which is bad. In math, it’s worse: the US has fewer excelling at the top, and more falling into the lower levels.
Digital problem-solving is even worse:
Maybe the test is really hard?
Here’s a level-2 numeracy question from the PIAAC test. The test taker gets a bar graph showing the number of workers absent each month in year 2. Then they get a table with the same data. Two of the data points in the table are wrong, and they have to spot the mistakes.
Among young adults, 27% of Americans with a high-school diploma could not solve a problem like this, compared with only 13% internationally.
Indeed, the US’s lackluster PISA results look good compared to these latest scores. “PIAAC is not as difficult as PISA,” said Provasnik. “PISA is an assessment at the end of secondary school that is designed to measure the skills of high school students.” PIAAC? “It’s measuring basic workplace skills.”
Unfortunately, this does not seem to come down to a lack of investment, which would be easily addressed. Slovakia spends about $53,000 (pdf) per student and performs at about the same level as the US, which spends about $115,000 per student. Korea, the highest-performing OECD country in mathematics, spends well below the average per student.
That means US high schools aren’t preparing students for the workplace, and even though the US may boast top-ranked universities (usually ranked by their research output), those universities aren’t doing a good job of preparing students for the real world either.
“Clearly, we have some work to do in this country,” Peggy Carr, the acting commissioner of the US government’s National Center for Education Statistics, told NPR. That seems like an understatement that even the least-savvy test takers among us can understand.