The massive global race to teach an AI to beat Starcraft II is under way

The fight is on. Image: Blizzard

If it takes a village to raise a child, it’ll take some of the world’s top computing minds to train an AI to operate like a human.

This week, DeepMind and Facebook functionally kicked off the race to create an artificial intelligence capable of mastering the complex real-time strategy game Starcraft II. DeepMind, a sister company to Google, and Blizzard, the studio that created Starcraft, last year launched a software tool that allows an AI to act as a player in the game. What DeepMind and Facebook have now done is release vast datasets recording tens of thousands of games played by humans. The companies will use this data to train deep-learning algorithms to play the game.
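The tool works through a Python programming interface that lets outside code read the game and issue orders. As a rough sketch of what plugging into it looks like, the snippet below sets up an agent that simply watches a match and does nothing. It assumes DeepMind’s open-source pysc2 library and a local Starcraft II install with the standard map packs, and the exact class and argument names vary between pysc2 releases:

```python
# A minimal "do nothing" agent wired into the Starcraft II API.
# Assumes DeepMind's open-source pysc2 library (names taken from a recent
# release and may differ in older versions) plus a local game install.
from absl import app
from pysc2.agents import base_agent
from pysc2.env import sc2_env, run_loop
from pysc2.lib import actions, features


class NoOpAgent(base_agent.BaseAgent):
    """An agent that observes the game but issues no orders."""

    def step(self, obs):
        super(NoOpAgent, self).step(obs)
        return actions.FUNCTIONS.no_op()


def main(unused_argv):
    with sc2_env.SC2Env(
        map_name="Simple64",  # a small two-player map from the map packs
        players=[sc2_env.Agent(sc2_env.Race.terran),
                 sc2_env.Bot(sc2_env.Race.random, sc2_env.Difficulty.very_easy)],
        agent_interface_format=features.AgentInterfaceFormat(
            feature_dimensions=features.Dimensions(screen=84, minimap=64)),
        step_mul=8,  # the agent acts once every 8 game steps
        visualize=False) as env:
        run_loop.run_loop([NoOpAgent()], env, max_episodes=1)


if __name__ == "__main__":
    app.run(main)
```

A real bot would replace the no_op() call with orders chosen from the observation the agent receives at each step.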

And by opening their datasets to the world, DeepMind and Facebook will massively multiply their own resources, creating a global community around the problem they’re both trying to solve. Anyone else can now use the data to train AI software, share their results, and ask questions of others in public internet forums. Theoretically, the entire community will benefit from everyone’s successes. This open-source method, originally developed for conventional software like the Linux and Android operating systems, is now increasingly being used in AI research too.

What makes Starcraft worth tackling in this crowdsourced way is its complexity. Games that AI has already mastered, like chess and Go, involve controlling a few dozen pieces on a small grid of squares. A player in Starcraft not only controls hundreds of units like builders and warships across a vast virtual territory, but must also manage resources like minerals and gas, plan ahead, anticipate opponents’ strategies, and remember how those dynamics have changed from moment to moment. An AI that can play Starcraft well will be a lot closer to one that can do useful real-world tasks such as managing fleets of robots or autonomous vehicles—a big prize for the company that can commercialize it.

Deep learning, the variety of AI that learns from huge datasets, is popular for complex problems like Starcraft because it can discover patterns in gameplay on its own. After being shown tens of thousands of recorded games, a deep-learning system sees how each strategy affects the outcome and identifies the best pieces of each one. (In a much simpler example, a deep-learning system can look at hundreds or thousands of pictures of cats to learn the pattern of pixels associated with the concept of a cat.)
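Stripped to its essentials, that kind of pattern-learning looks like the toy example below: a small Python sketch using scikit-learn, with made-up numbers standing in for game states and a hidden rule standing in for “good play.” None of it reflects the models DeepMind or Facebook actually use; it just shows a system inferring a pattern purely from labeled examples.

```python
# Toy illustration of supervised pattern-learning (hypothetical data).
from sklearn.neural_network import MLPClassifier
import numpy as np

rng = np.random.default_rng(0)

# Made-up "game states": 1,000 examples with 4 numeric features each.
# The hidden rule labels an example 1 when feature 0 + feature 1 > 1.0.
X = rng.random((1000, 4))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

# A tiny neural network; it is never told the rule, only shown examples.
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
model.fit(X[:800], y[:800])            # learn the pattern from 800 examples
print(model.score(X[800:], y[800:]))   # accuracy on 200 unseen examples
```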

Facebook and DeepMind aren’t pioneers in this effort: Starcraft already has a community of AI researchers trying to beat the game, who have had “almost zero support” from Blizzard, tweeted AI researcher Mike Cook. But the game company, which is working with DeepMind, has now built tools that make it easier for this new wave of AI researchers to develop their software.

Now that the data are public, we’ll be watching to see how well the bots actually play, and hoping for a Facebook vs. DeepMind battle royal in the near future.