‘Top players need adaptable short- and long-term strategies to grow and defend their bases while laying waste to the opposition. To complicate matters, players cannot see the whole “map” of the game, so decisions are made on partial information.’

After 44 days of training, an AI model from DeepMind outperformed 99.8% of human players at StarCraft II, a sophisticated e-sports game widely considered to sit at the outer edge of human mastery.
The Guardian reports the achievement was the next ‘grand challenge’ after chess, Go, and poker, all games where AI systems have shown superior play.
The number of possible moves at any point is 10 to the 26th power. The DeepMind system, named AlphaStar, was deliberately slowed so it could not win on calculating speed alone; instead, it had to learn the strategy of play.
The research has implications for advances in areas such as self-driving cars, personal assistants, weather and climate modelling, and other applications involving complexity.
SEE RELATED STORIES
- DeepMind’s AlphaStar AI is now a full-blown StarCraft 2 Grandmaster
  SPACE.COM | November 1, 2019 | by Andy Chalk | ‘The machine is better than 99.8 percent of players on Battle.net, “under professionally approved conditions.”’
- DeepMind’s AI has now outcompeted nearly all human players at StarCraft II
  MIT TECHNOLOGY REVIEW | October 30, 2019 | by Karen Hao | ‘The results, published in Nature today, could have important implications for applications ranging from machine translation to digital assistants or even military planning.’
- Grandmaster level in StarCraft II using multi-agent reinforcement learning
  NATURE | October 30, 2019 | ‘AlphaStar was rated at Grandmaster level for all three StarCraft races and above 99.8% of officially ranked human players.’