The US passed China with a supercomputer capable of as many calculations per second as 6.3 billion humans

The US government’s shiny new toy.
Image: Oak Ridge National Laboratory

For the first time in five years, the world’s fastest computer is no longer in China.

Yesterday (June 8), the US Department of Energy’s Oak Ridge National Laboratory announced the top speeds of its Summit supercomputing machine, which nearly laps the previous record-holder, China’s Sunway TaihuLight. The Summit’s theoretical peak speed is 200 petaflops, or 200,000 teraflops. To put that in human terms, approximately 6.3 billion people would all have to make a calculation at the same time, every second, for an entire year, to match what Summit can do in just one second. (Another way to see it: if you want to go toe-to-toe with Summit yourself, settle in. You’ll be making a calculation every single second for the next 6.3 billion years.)
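The article's back-of-the-envelope comparison checks out, and is easy to verify yourself. A quick sketch, using the 200-petaflop peak figure from the article and a standard 365-day year:

```python
# Sanity-check the article's arithmetic.
# 200 petaflops = 200e15 calculations per second (figure from the article).
SUMMIT_PEAK_FLOPS = 200e15

# Seconds in a (non-leap) year: ~31.5 million.
SECONDS_PER_YEAR = 365 * 24 * 3600

# People needed, each making one calculation per second for a full year,
# to match what Summit does in ONE second:
people = SUMMIT_PEAK_FLOPS / SECONDS_PER_YEAR
print(round(people / 1e9, 1))  # → 6.3 (billion people)

# Equivalently, the years ONE person would need at one calculation per second:
years = SUMMIT_PEAK_FLOPS / SECONDS_PER_YEAR
print(round(years / 1e9, 1))  # → 6.3 (billion years)
```

The two framings are the same division, which is why both come out to 6.3 billion.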

Supercomputing technology has been improving rapidly in recent years. Just over a decade ago, the world hadn’t yet built a machine that could crack even a single petaflop (or 1,000 teraflops). Now, in just a year, we’ve gone from 125 petaflops to 200.

At eight times the speed of the US’s previous fastest computer, Summit is a major advance for the country’s supercomputing efforts. The Oak Ridge team says the system, which cost $200 million to build, is the first supercomputer made bespoke for use in artificial-intelligence applications. That’s important because, in many ways, AI has become the new space race, with countries all around the world investing huge amounts of money into the field. China and the US are at the front of the pack, but Russia, the UK, the EU, and Canada are all deeply invested in AI research as well.

And despite the US now owning the world’s fastest machine, China still operates more supercomputers overall.

Supercomputers have myriad uses, many of which are essential to national security and the general welfare of the public. In the US, for example, the National Oceanic and Atmospheric Administration uses supercomputers to predict climate trends and model weather patterns. The Energy Department uses them to run nuclear simulations, and to mine data to find oil and natural gas deposits. The National Security Agency and similar government bodies rely on supercomputers to crack encryption codes. These powerful machines are also necessary for handling the massive datasets required for advanced genomic research, one of the most promising fields in medical science.

More powerful machine learning and neural-network capabilities would advance all these fields, and, presumably, Summit will lead the way.

As impressive as Summit is, many see it as just a stepping stone to the real target: building a machine that can carry out an exaflop, which is 1,000 petaflops.

The US government is reportedly already talking to manufacturers about developing several exaflop supercomputers, and energy secretary Rick Perry said yesterday that the department wants to deliver the first by 2021. These efforts are largely perceived as taking place with an eye toward staying ahead of China in the supercomputing race.