Of humanity’s many notable achievements since the end of World War II, perhaps the greatest is simply that we haven’t yet destroyed ourselves.
In the past 70 years, we have sent men to the moon, eradicated smallpox, and created the modern information technology revolution. But at the same time, we have built ever more powerful killing machines, created biological agents of warfare, refused to take action on a warming planet, started playing god with other organisms, and learned how to use software to cause physical damage. It is a small miracle we’re still here to appreciate our advancements.
For seven decades, one publication has been dedicated to watching these threats—and warning of their consequences. Established in 1945 as the world entered the nuclear age, the Bulletin of the Atomic Scientists was founded by scientists who worked on the Manhattan Project as a forum for public debate on the dangers of nuclear technology. In 1947, the magazine created the “Doomsday Clock,” which every year assesses how close we are to midnight—a metaphor for global annihilation. This month, the Bulletin, which won America’s National Magazine Award for General Excellence in 2007, celebrates its 70th anniversary with a free issue looking back at seven decades of scientists writing in the public interest.
Always with us
It is easy to think today that the threat of nuclear annihilation receded with the end of the cold war. Indeed, even the science and security board of the Bulletin, responsible for setting the Doomsday Clock, in 1991 estimated that humanity was as far as it had ever been from destroying itself. Both the Bulletin’s board and the rest of us were quickly disabused of that notion.
In the years since, India, Pakistan and North Korea have tested nuclear weapons, Iran seems intent on acquiring them, and Russia and the United States have embarked on an ambitious—and expensive—modernization program for their weapons. John Mecklin, the editor of the Bulletin, estimates that the US is spending close to $1 trillion over 30 years to modernize its “triad,” or the three legs of its nuclear weapons capability: missiles, submarines, and airborne bombers. Russia is doing the same.
“These improvements—if you want to call them that—are not just incredibly expensive, they are designed to make the weapons more reliable, harder to shoot down, more accurate. It is what many view as the beginning of a modernization arms race. Not just the US and Russia. All nuclear weapons countries are involved in upgrading and modernizing their arsenals,” says Mecklin.
Despite perceptions that the era of preparing for complete nuclear annihilation has passed, “the situation is just as dire as it ever was,” says Mecklin. Global arms reduction efforts have stalled. Though down from the heights of the 1980s, there remain thousands of commissioned nuclear weapons ready to go at a few minutes’ notice. And recent tensions between the West and Russia haven’t made the world any more secure. “If there were a nuclear confrontation, the likelihood that civilization would end is very high even if it’s a relatively small confrontation. There are some studies that show that exchanges as low as 50 weapons or 100 weapons would create worldwide nuclear winter.”
The dark side of technology
It is not just nuclear weapons that threaten humanity. The Bulletin has in the past decade formally expanded its remit to include new threats to civilization, namely climate change and biological warfare. The magazine also informally looks at other threats, such as cyberwar that could spark real-world confrontation; synthetic biology that could create new pandemics; and lethal autonomous weapons (or “killer robots”) that can target and fire without human intervention. The Bulletin is also beginning to look at artificial intelligence.
The new threats are, like the old threats, products of human ingenuity and technological advancement. They arise from what are called “dual-use technologies.” Nuclear energy provides cheap, clean energy, and nuclear medicine saves lives, but nuclear weapons are catastrophic advancements on conventional weapons. Similarly, information technology has enriched the world in myriad ways, synthetic biology could save many more lives, and autonomous robots could make the world a safer place—but all these technologies also could be used to cause harm.
What is the solution to emerging threats? Mecklin argues in favor of new organizations tasked with paying attention to developments in these areas. The International Atomic Energy Agency monitors the nuclear industry. The Biological Weapons Convention covers bio-warfare. But cyberwar and other emerging technologies have no globally agreed-upon standards to which states must adhere. Creating these would be a first step.
Closer to midnight
The decision to pay attention to new threats has paid off for the Bulletin, as has going digital-only. Mecklin says the Bulletin’s website gets some 100,000 unique visitors a month, up 50% on last year. Moreover, readers download some 500,000 articles a year from the bimonthly subscription magazine on Sage Publications. The numbers are small compared to mainstream news media, but the readers are an influential bunch, ranging from American and Russian defense officials to scientists in the Middle East and South Asia.
The Doomsday Clock, too, remains a potent ambassador for the magazine. The board makes a decision every year about whether we have come closer to or moved further away from destroying ourselves. The next decision will be announced on Jan. 22. Considering the disastrous year global security had in 2014, it would seem foolish to bet against ticking ever closer to midnight.
Looking at the myriad new threats now facing humanity, Mecklin poses a crucial question in his introduction to the 70th anniversary issue: “In many ways, the question today is the same one that confronted the atomic scientists in the fall of 1945: Can humans learn to control the potentially catastrophic misuse of the technologies they create, or will they let those technologies destroy them?”
Asked by Quartz to answer his own question, Mecklin says this:
“I do this job. If I didn’t believe that humans could in some way control technologies and not destroy themselves I would not be in the job I am. More than probably anybody, I realize what a challenge the nuclear age has been. And the response has been far from perfect. But in the end you have to figure that humanity doesn’t want to kill itself. And you just have to hope that people who get in power in major countries in the world realize the power of these technologies.”