DeepMind’s AlphaStar Wins Against Humans in StarCraft II

Updated: Feb 24, 2023

Matches were organized between AlphaStar, the latest bot from Google’s DeepMind AI company, and two professional players of ‘StarCraft II’, a real-time strategy game. All games were 1-versus-1 matches, and the overall score was a heavy 10-1 win for AlphaStar: each professional was beaten 5-0 before one of them finally managed to take a game.

AlphaStar runs on a deep neural network that first learned how to play by observing replays of real StarCraft II matches, then trained further through reinforcement learning for 14 days, racking up the equivalent of roughly 200 years of gameplay experience, far more than any human could accumulate. Along the way it had to balance short- and long-term goals and adapt to unexpected situations. To reach this point, researchers had to tackle several hard AI challenges at once: game theory, imperfect information, long-term planning, real-time decision making and a very large action space.

AlphaStar’s abilities do come at a price: it required the computational expense of 50 GPUs. It was also given a certain advantage over a human player, essentially access to information across the whole map that a human cannot see at once, and when that advantage was taken away the human player was victorious. The professionals were also observed issuing hundreds of actions per minute (APM), more than AlphaStar, but thanks to its far superior reaction time AlphaStar was able to execute its actions more precisely.

While StarCraft II is just a game, the result shows that the latest generation of AI has the potential to handle other complex problems such as climate modeling, language understanding or military strategy.
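As a rough illustration of the two-stage recipe described above (first imitating human replays, then improving through reinforcement learning), here is a minimal, self-contained sketch in Python with PyTorch. The tiny network, the random stand-in “replay” data, the placeholder rewards and the plain REINFORCE update are all assumptions made for this example; none of it reflects DeepMind’s actual AlphaStar architecture or training code.

```python
# Toy two-stage training loop: supervised imitation from (observation, action)
# pairs, followed by policy-gradient reinforcement learning. Illustrative only.
import torch
import torch.nn as nn
from torch.distributions import Categorical

OBS_DIM, N_ACTIONS = 32, 10  # toy sizes; real StarCraft II observations/actions are far larger

policy = nn.Sequential(nn.Linear(OBS_DIM, 64), nn.ReLU(), nn.Linear(64, N_ACTIONS))
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

# Stage 1: imitation learning -- fit the policy to actions seen in "replays".
# Here the replays are random placeholders standing in for real game data.
replay_obs = torch.randn(256, OBS_DIM)
replay_actions = torch.randint(0, N_ACTIONS, (256,))
for _ in range(200):
    loss = nn.functional.cross_entropy(policy(replay_obs), replay_actions)
    opt.zero_grad(); loss.backward(); opt.step()

# Stage 2: reinforcement learning -- refine the imitation policy with REINFORCE
# against a stand-in environment that returns a random reward per step.
def play_episode(policy, steps=20):
    log_probs, rewards = [], []
    obs = torch.randn(OBS_DIM)
    for _ in range(steps):
        dist = Categorical(logits=policy(obs))
        action = dist.sample()
        log_probs.append(dist.log_prob(action))
        rewards.append(torch.randn(()))   # placeholder for the game's win/loss signal
        obs = torch.randn(OBS_DIM)        # placeholder for the next observation
    return torch.stack(log_probs), torch.stack(rewards)

for _ in range(100):
    log_probs, rewards = play_episode(policy)
    ret = rewards.sum()                    # episode return (no discounting, for brevity)
    loss = -(log_probs.sum() * ret)        # REINFORCE: raise probability of rewarded actions
    opt.zero_grad(); loss.backward(); opt.step()
```

The ordering matters: the imitation stage gives the agent a sensible starting policy so that reinforcement learning does not have to discover basic play from scratch; the sketch keeps that ordering but nothing else.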
