DeepMind’s AI trained to play StarCraft 2 now better than 99.8% of players

DeepMind announced yesterday that its AlphaStar AI has reached the stage where it can beat 99.8% of StarCraft 2 players on the official Battle.net game server.
AlphaStar will likely reach the point where it can easily beat all human players soon. Photo: dronepicr via Flickr

DeepMind’s latest game AI, AlphaStar, has reached a level of ability not yet achieved by any other game AI.

Specifically, AlphaStar has become the first AI to reach the top league of any popular esport whilst playing without game restrictions.

DeepMind’s other well-known game AI, AlphaGo, hit the headlines back in 2016 when it defeated one of the world’s highest ranked Go players, Lee Sedol. Just a year later it comprehensively beat the world number one Ke Jie.

Later in 2017, AlphaGo was itself beaten 100 games to 0 by an improved version called AlphaGo Zero.

DeepMind’s AlphaGo AI made waves in 2016 with its historic win against Lee Sedol. Photo: Buster Benson via Flickr

AlphaGo's victories and AlphaStar's recent success in StarCraft 2, a complex real-time strategy game, are both examples of the remarkable pace at which game-playing AI has improved over the past five or so years.

Just how impressive is AlphaStar’s achievement?

Blizzard Entertainment’s StarCraft 2 is one of the most popular and competitive esports out there. The level of skill required to reach the upper echelons of the player rankings is incredibly high.

AlphaStar is ranked at ‘grandmaster’, the highest rank possible for StarCraft 2 players. Not only has it achieved grandmaster-level proficiency, it has achieved this rank using all three of the StarCraft 2 races – Protoss, Terran, and Zerg.

This is effectively like learning to play the game three times over, as each race comes with its own strategic and tactical nuances.

AI usually has a number of inherent advantages over human players. After all, computers are just far better than humans at certain things.

But the AlphaStar team at DeepMind took this into account, making sure the AI was restricted by the same limitations human players face.

As a result, AlphaStar had to view the game through a camera, and was only able to see the same part of the screen a human player would.

Additionally, the maximum click/action rate was limited to 22 separate actions every five seconds, to mimic the ability of human players.
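
For a concrete sense of what such a cap means in practice, here is a minimal, purely illustrative sketch of a sliding-window rate limiter that allows at most 22 actions in any five-second window. The class name, method names, and parameters are assumptions made for this example; it is not DeepMind's actual AlphaStar interface.

```python
from collections import deque

class ActionRateLimiter:
    """Toy sliding-window limiter: at most `max_actions` per `window_seconds`.

    Illustrative only -- not DeepMind's actual AlphaStar implementation.
    """

    def __init__(self, max_actions=22, window_seconds=5.0):
        self.max_actions = max_actions
        self.window_seconds = window_seconds
        self.timestamps = deque()  # times of recently issued actions

    def try_act(self, now):
        """Return True if an action may be issued at time `now` (in seconds)."""
        # Drop actions that have fallen outside the five-second window.
        while self.timestamps and now - self.timestamps[0] >= self.window_seconds:
            self.timestamps.popleft()
        if len(self.timestamps) < self.max_actions:
            self.timestamps.append(now)
            return True
        return False  # over the cap: the agent must wait


# Example: 30 actions attempted in a three-second burst, only 22 go through.
limiter = ActionRateLimiter()
allowed = sum(limiter.try_act(now=0.1 * i) for i in range(30))
print(allowed)  # -> 22
```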

How long did it take to reach this level?

Although AlphaStar achieved its top ranking after a series of matches played in August, the groundwork for this achievement was laid back in January.

According to reports by The Verge, AlphaStar won 10 matches in a row against top-ranked professional players at the start of the year.

StarCraft 2 is one of the most competitive esports around. Photo: Altostratus via Flickr

It was then defeated by professional player Grzegorz "MaNa" Komincz in a final live match, prompting the team at DeepMind to work on further improvements to the AI's learning mechanisms.

The key findings of the AlphaStar project have revolved around learning mechanisms for AI systems in various environments.

When it comes to StarCraft 2, the researchers found that straightforward self-play reinforcement learning on its own was largely ineffective due to the massive number of variables present in the game; AlphaStar instead bootstrapped from human replays and then trained against a league of diverse opponents.
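
As a rough illustration of the league idea, the toy loop below trains a new agent by sampling opponents from a growing pool of frozen past snapshots rather than only playing against its latest self. Every name here, including the `train_one_match` placeholder, is hypothetical and stands in for AlphaStar's actual training code.

```python
import copy
import random

def train_one_match(agent, opponent):
    """Placeholder for one game plus a learning update.

    In a real system this would run a StarCraft 2 match and apply a
    reinforcement-learning update to `agent`; here it is a stub.
    """
    return agent

def league_training(initial_agent, num_iterations=1000, snapshot_every=50):
    """Toy league loop: the learner faces a pool of frozen past snapshots."""
    league = [copy.deepcopy(initial_agent)]    # pool of frozen opponents
    agent = initial_agent
    for step in range(num_iterations):
        opponent = random.choice(league)       # diverse opponents, not just the latest self
        agent = train_one_match(agent, opponent)
        if step % snapshot_every == 0:
            league.append(copy.deepcopy(agent))  # freeze a copy into the league
    return agent, league
```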

The full findings of the AlphaStar project have been published in Nature, and the full library of games played by the AI is available to watch here.
