StarCraft Could Be the Next Game AI Destroys Us At

Image: Mike Prosser/Flickr/Screenshot

My experience with StarCraft was probably the stupidest possible: I really liked the books. Sure, I played the games, but mostly I played single-player because I was not very good, and enjoyed the story. But for some reason, it was Sarah Kerrigan’s tale told through the novelization that resonated with me the most.


Anyway, I was, and still am, very dumb. Others actually enjoy playing the complex real-time strategy game (against human players) that puts alien races in conflict with one another. But in the future, players may have to go up against AI opponents, too. That’s because Facebook is now releasing an enormous AI-training data set, consisting of over 65,000 StarCraft replays broken into 1.5 billion frames, equaling 365GB of data—the largest set of StarCraft replays yet by a factor of ten. Meanwhile, Google’s DeepMind and Blizzard are releasing tools to train AI on their own large StarCraft II data set.

Facebook researchers published a paper on their data set Monday on the arXiv preprint server, explaining that StarCraft is a complex game to learn, and one for which plenty of expert playthroughs exist (since there are human experts). There’s a lot of data, and lots of scenarios to train a neural network on. AI can potentially use this data to learn how to classify different gameplay strategies, improve gameplay without a reward, predict how games will unfold, or learn how to play given only a demonstration and no instructions.

The researchers don’t specify whether the data is from StarCraft, StarCraft: Brood War, or StarCraft II. But, seemingly coincidentally, today Google’s DeepMind and the game’s creator Blizzard released tools to train artificial intelligence on StarCraft II, as reported by The Verge, after announcing a partnership last year. Oriol Vinyals, a Google DeepMind researcher, explained to them that StarCraft’s “fog of war,” which hides parts of the map you haven’t explored, requires the computer to remember the enemy’s locations and continue to scout, as human players must do when they play.
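To make the fog-of-war problem concrete, here’s a toy sketch (mine, not from either Facebook’s or DeepMind’s actual tooling; all names are hypothetical) of the kind of memory an agent needs: positions go stale the moment enemy units leave vision, so the agent has to keep scouting to refresh them.

```python
# Illustrative only: a minimal "fog of war" memory for a grid-based map.
# This is an assumption-laden sketch, not code from any released data set.

class FogOfWarMemory:
    """Remembers the last place and time each enemy unit was seen."""

    def __init__(self):
        self.last_seen = {}  # unit_id -> (position, frame_last_observed)

    def observe(self, frame, visible_units):
        # visible_units: {unit_id: (x, y)} for units currently in our vision
        for unit_id, pos in visible_units.items():
            self.last_seen[unit_id] = (pos, frame)

    def believed_position(self, unit_id):
        # Returns stale info once the unit leaves our vision -- the agent
        # must scout to refresh it, just as a human player would.
        entry = self.last_seen.get(unit_id)
        return entry[0] if entry else None

memory = FogOfWarMemory()
memory.observe(frame=10, visible_units={"zergling_1": (4, 7)})
memory.observe(frame=50, visible_units={})  # zergling has left our vision
print(memory.believed_position("zergling_1"))  # (4, 7) -- stale but remembered
```

The point of the exercise: unlike Go or chess, the agent’s belief about the board can be wrong, which is part of why researchers consider StarCraft a harder benchmark.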

I’ve reached out to Facebook for comment, given that Google and Blizzard released their tools and data set a day later, and to see whether and how the two data sets are related.

It might not be long before an AI kicks your ass at the online multiplayer games you love. But Byun Hyun Woo, world champion StarCraft II player, was skeptical, as reported by MIT Tech Review earlier this year. He told them: “I don’t think AI can beat [a professional player], at least not in my lifetime.” Then again, AI has already proved human players wrong about such assumptions at other games, like Go.

And on top of that, folks at the IT University of Copenhagen, Denmark, are already using what they’ve learned from AlphaGo to train AI on 630,000 moves from 2,000 StarCraft games, reports New Scientist.


Anyway, you should read the books because they’re very good.

[arXiv, The Verge, Tech Review]


Former Gizmodo physics writer and founder of Birdmodo, now a science communicator specializing in quantum computing and birds



The pro player is right that AI has a long way to go, assuming it’s subject to similar constraints as its human competitors. However, if we allow the AI to do double or more the operations per minute of a top pro player with pixel-perfect selection and assignment mechanisms… then it will be however long it takes DeepMind to create a decent AI, maybe a year or two. But to say it’s a fair match when the human is using a mouse and keyboard, and is constrained by the accuracy and speed of those mechanisms, is definitely a stretch. I would like to see the AI out-strategize a human consistently while still being subject to the same limitations as the human. Only then could we say AI is truly destroying us in SC2, IMHO.