Further progress in science and technology is putting humanity at risk of self-annihilation as it creates "new ways things can go wrong."
That's the grim warning from Professor Stephen Hawking, who is giving this year's Reith Lectures at the BBC. While most of his lectures will focus on what Hawking is best known for—research into black holes—he still took the time to issue his latest doomsday warning:
Although the chance of a disaster to planet Earth in a given year may be quite low, it adds up over time, and becomes a near certainty in the next thousand or ten thousand years.
By that time we should have spread out into space, and to other stars, so a disaster on Earth would not mean the end of the human race.
The world-renowned physicist suggests that humanity needs to be "very careful in this period" until we are able to establish self-sustaining colonies in space, which he estimates could take at least the next hundred years. Hawking says that while further progress in science and technology cannot be stopped or reversed, "we must recognize the dangers and control them."
This isn't Hawking's first warning about dangers of our own making. "The development of full artificial intelligence could spell the end of the human race," he once told the BBC. Hawking was also among a number of leading scientists, engineers, and businesspeople who signed an open letter opposing the idea of "starting a military AI arms race."
The letter warns that "it will only be a matter of time" before AI technology reaches a point where it can deploy autonomous weapons capable of "assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group."
Though he has issued a number of doomsday warnings, Hawking says he remains optimistic that humans will recognize the dangers and find a way to cope.