When the US entered the nuclear age, it did so recklessly. New research suggests that the hidden cost of developing nuclear weapons was far larger than previously estimated, with radioactive fallout responsible for 340,000 to 690,000 American deaths from 1951 to 1973.
The study, performed by University of Arizona economist Keith Meyers, uses a novel method to trace the deadly effects of this radiation, which was often consumed by Americans drinking milk far from the site of atomic tests.
From 1951 to 1963, the US tested nuclear weapons above ground in Nevada. Weapons researchers, not understanding the risks—or simply ignoring them—exposed thousands of workers to radioactive fallout. The emissions from nuclear reactions are deadly to humans in high doses, and can cause cancer even in low doses. At one point, researchers had volunteers stand underneath an airburst nuclear weapon to prove how safe it was.
The emissions, however, did not stay at the test site; they drifted in the atmosphere. Cancer rates spiked in nearby communities, and the US government could no longer pretend that fallout was anything but a silent killer.
Congress eventually paid more than $2 billion to residents of nearby areas that were particularly exposed to radiation, as well as uranium miners. But attempts to measure the full extent of the test fallout were very uncertain, since they relied on extrapolating effects from the hardest-hit communities to the national level. One national estimate found the testing caused 49,000 cancer deaths.
Those measurements, however, did not capture the full range of effects over time and geography. Meyers created a broader picture by way of a macabre insight: When cows consumed radioactive fallout spread by atmospheric winds, their milk became a key channel to transmit radiation sickness to humans. Most milk production during this time was local, with cows eating at pasture and their milk being delivered to nearby communities, giving Meyers a way to trace radioactivity across the country.
The National Cancer Institute has records of the amount of iodine-131—a dangerous isotope released in the Nevada tests—in milk, as well as broader data about radiation exposure. By comparing this data with county-level mortality records, Meyers came to a significant finding: “Exposure to fallout through milk leads to immediate and sustained increases in the crude death rate.” What’s more, these effects persisted over time. US nuclear testing likely killed seven to 14 times more people than we had thought, mostly in the Midwest and Northeast.
When the US used nuclear weapons during World War II, bombing the Japanese cities of Hiroshima and Nagasaki, conservative estimates suggest 250,000 people died in the immediate aftermath. Even those horrified by the bombing didn’t realize that the US would deploy similar weapons against its own people, accidentally, and on a comparable scale.
And the cessation of nuclear testing helped save US lives—“the Partial Nuclear Test Ban Treaty might have saved between 11.7 and 24.0 million American lives,” Meyers estimates. There was also some blind luck involved in reducing the number of poisoned people: The Nevada Test Site, compared to other potential testing facilities the US government considered at the time, produced the lowest atmospheric dispersal.
The lingering effects of these tests remain, as silent and as troublesome as the isotopes themselves. Millions of Americans who were exposed to fallout likely suffer illnesses related to these tests even today, as they retire and rely on the US government to fund their health care.
“This paper reveals that there are more casualties of the Cold War than previously thought, but the extent to which society still bears the costs of the Cold War remains an open question,” Meyers concludes.