AIs are being raised on video games...not to beat us, but to join us

Xiaomi Black Shark gaming smartphone. / © nextpit by Irina Efremova

Sshhh. Turn off your smartphone, disable your Google Home. Can the machines hear us? No? Good. Fellow humans, we have to talk. AI is beating us at video games. Hard. It's embarrassing, but besides biological pride, why should we care? Why is so much effort and money put into teaching AIs to get good at games? The thing is, video gaming isn't playtime for robots. In fact, they're going to school.

Back in 1997, a game took place that rocked the world. When IBM's supercomputer Deep Blue beat Garry Kasparov in a chess match, it was seen as a sign that computer intelligence had surpassed human intelligence. But Deep Blue was no genius: the machine champion won by sheer brute force, evaluating millions of positions per second. Sad for the mystique of chess, but humanity remained safe from AI competition when it came to most other games.

The ancient Chinese game of Go, which occupies a similar cultural position to chess, was the next to fall. Go has simpler rules than chess, but vastly more possible positions, and generally requires more intuition than raw brainpower. It took until 2016, when Google DeepMind's AlphaGo program surprisingly defeated Lee Sedol, for an AI champion to emerge. The secret to success? Deep learning.

Of course, video gamers have been battling AI for many years, and the fact is that our AI adversaries have been laughably easy to beat and exploit, forcing game devs to resort to giving the computer unfair in-game advantages. I can't blame them, since it's difficult for simple AI to present a challenge to experienced humans.

In fact, you can have a go at video game AI programming yourself. To get some insight into this, I tried Gladiabots, a game where you program a team of robots to fight and score goals in matches against similarly programmed teams. Easy enough for a novice like me, it still showed just how complex it can be to program priorities, sequences, and conditions for relatively simple video game rules and goals. And that's just with bots following instructions. When you put learning into the mix, that's when it really gets interesting.
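
To give a flavor of what that looks like, here's a minimal Python sketch of the same idea: a bot that checks conditions in priority order and performs the first matching action. The Robot class and the condition and action names are my own inventions for illustration - Gladiabots itself uses a visual node editor, not Python.

```python
# A minimal condition/priority bot in the spirit of Gladiabots' AI editor.
# All names here are hypothetical; the game exposes no Python API.

class Robot:
    def __init__(self, health, has_ball, enemy_in_range):
        self.health = health                  # 0-100
        self.has_ball = has_ball              # carrying a scoring objective?
        self.enemy_in_range = enemy_in_range  # hostile bot within weapon range?

    def decide(self):
        """Evaluate conditions top-down; the first match wins."""
        if self.health < 30:
            return "retreat_to_base"       # survival trumps everything
        if self.has_ball:
            return "carry_ball_to_goal"    # scoring is the main objective
        if self.enemy_in_range:
            return "attack_nearest_enemy"  # clear threats before collecting
        return "move_to_nearest_ball"      # default behavior

bot = Robot(health=80, has_ball=False, enemy_in_range=True)
print(bot.decide())  # -> attack_nearest_enemy
```

Even this trivial version shows where the difficulty lies: the ordering of the conditions is the strategy, and a single misplaced rule can make the whole team behave stupidly.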

Educating robots through play

Whenever AI beats the most talented human players at a video game, there's some buzz in the news. Most recently, OpenAI Five, a team of computer algorithms, beat former pro players of the popular Dota 2, and it wasn't even a close-run thing. Dota 2 is a MOBA (multiplayer online battle arena), a genre that demands team coordination, long-term strategy, resource management, and the micro-management of the heroes battling on each team.

The progress of OpenAI Five at mastering Dota shows just how swiftly AIs can develop their skills with deep learning. The Five were losing to amateur players back in May. By June, the AI had learned enough to defeat higher-level casual players, and now it's good enough to wipe the floor with former pros...the next step is the current pro champions. This AI team is learning at the rate of 180 years of gameplay per day, OpenAI CTO Greg Brockman claimed in an interview with The Verge. Obviously, no human lives long enough to learn a video game in this way. Instead, we rely on our general intelligence and knowledge picked up through interacting with the world, and learn the game on top of that. But video games can also help teach a more general intelligence, as we'll get into later.
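
That inhuman learning rate comes from self-play: the AI plays against copies of itself across thousands of parallel games and improves from every outcome. Here's a toy Python sketch of that loop; the match and update functions are placeholders, while the real OpenAI Five runs a reinforcement learning algorithm (Proximal Policy Optimization) at enormous scale.

```python
# Toy self-play training loop. play_match and update are stand-ins for
# a full Dota 2 game and a gradient update, respectively.
import random

def play_match(policy_a, policy_b):
    """Stand-in for a full game; returns the winning policy."""
    return policy_a if random.random() < 0.5 else policy_b

def update(policy, won):
    """Stand-in for a reinforcement learning update from the match."""
    policy["games_played"] = policy.get("games_played", 0) + 1

agent = {"name": "five_v0"}
for generation in range(1000):
    opponent = dict(agent)          # face a frozen copy of yourself
    winner = play_match(agent, opponent)
    update(agent, winner is agent)  # learn from the outcome either way

# Run enough of these loops in parallel and you get Brockman's
# "180 years of play per day".
```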

All in good fun, but what if you don't care who wins? We're not feeding these masses of data to AIs just to watch them play games. In fact, we're teaching them skills to help us in our day-to-day lives.

The goals: navigation and co-operation

Having AI co-operate with us in a shared space in the real world is getting closer to reality. But we're not there yet. A machine has different senses to us, and if it needs to navigate the world (for example, as a self-driving car or some kind of robotic assistant), it needs to be able to move through physical space and react to other moving objects in the world.

Ideally, it should be able to co-operate with a human in that same space. Video games are virtual worlds that attempt to simulate the real world, but are designed to be navigated by humans. This makes games a useful bridge between human and machine perception.

For example, Microsoft started Project Malmo back in 2016 with the aim of using the exploration and building mechanics of Minecraft's open-ended virtual world to help AI researchers teach machines how to navigate real space. Project Malmo is still ongoing and proves that it's not only simulated violence that's useful for training.
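
Malmo exposes Minecraft to researchers through a Python API. The condensed loop below follows the pattern of the project's published tutorial examples (start a mission, wait for the world to load, then issue movement commands while reading observations); I've simplified the details, and exact calls may vary between Malmo versions.

```python
# Condensed Malmo-style navigation loop, based on the tutorial examples
# at https://github.com/microsoft/malmo. Simplified for illustration.
import time
import MalmoPython

agent_host = MalmoPython.AgentHost()
mission = MalmoPython.MissionSpec()       # default flat-world mission
record = MalmoPython.MissionRecordSpec()  # no recording
agent_host.startMission(mission, record)

world_state = agent_host.getWorldState()
while not world_state.has_mission_begun:  # wait for Minecraft to load
    time.sleep(0.1)
    world_state = agent_host.getWorldState()

agent_host.sendCommand("move 1")          # walk forward continuously
while world_state.is_mission_running:
    world_state = agent_host.getWorldState()
    time.sleep(0.5)                       # observations arrive in world_state
```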

But speaking of violence, Google subsidiary DeepMind trained AI to play Quake III's capture the flag mode, with the bots having no more information to start with than what a human would have, i.e. visual input from the screen.

Yet the AI was still able to master the rules of capture the flag and come up with different co-operative strategies and tactics, i.e. it learned collective intelligence and co-operation towards a shared goal in a complex environment - vital for our future robot companions.
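
"Visual input from the screen" means the agent's policy is a neural network mapping raw pixels to game actions. The PyTorch model below illustrates the shape of such a network; it's a bare-bones stand-in of my own, not DeepMind's actual architecture (their agents add recurrence, population-based training and much more).

```python
# Bare-bones pixels-to-actions policy network (illustrative only).
import torch
import torch.nn as nn

class PixelPolicy(nn.Module):
    def __init__(self, num_actions):
        super().__init__()
        self.encoder = nn.Sequential(   # compress raw pixels into features
            nn.Conv2d(3, 16, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2), nn.ReLU(),
            nn.Flatten(),
        )
        self.head = nn.LazyLinear(num_actions)  # one score per game action

    def forward(self, frame):
        return self.head(self.encoder(frame))

policy = PixelPolicy(num_actions=12)
frame = torch.rand(1, 3, 84, 84)      # a fake 84x84 RGB game frame
action = policy(frame).argmax(dim=1)  # act greedily on the scores
```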

Complex MOBAs like Dota 2 generate huge amounts of information for AIs to digest and learn from, but more importantly, they teach AI skills that are valuable to us. Working towards a shared goal, fast problem solving, long-term planning, and teamwork...imagine all this in collaboration with humans, not in competition with us. On the chessboard or on video game maps, AI clashes with us as our enemy, but it's actually learning how to be our companion.

So in the spirit of friendly competition, in what arena can we look forward to being spanked by AIs next?

The next milestone: StarCraft II

Chess and Go may have fallen by the wayside, but there is one competitive game that holds a similar reverence in the gaming world: StarCraft and its successor, StarCraft II. Often compared to chess at the competitive level, StarCraft is a stadium-filling eSport in South Korea, home to most of the top pro players.

In StarCraft, players must scout the map, secure and extract resources, build their base and their army (from a variety of units with different strengths, weaknesses, and resource costs), and master both long-term strategy and the fine art of micro-managing their units in real time when they encounter the enemy. Even the best human players can reach up to 600 actions per minute (APM), or 10 actions per second. Obviously APM is no problem for computers, but learning everything else to a level where they can beat the strategic minds of the human pros is.

Last year, DeepMind partnered with StarCraft creator Blizzard Entertainment to release a StarCraft II AI research environment, but we haven't yet seen any breakthrough games like OpenAI's success at Dota 2. It's a more complex game, and the going is slow, but there's more going on than just making a good StarCraft II player.

AI practices its tasks in StarCraft II. / © DeepMind
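
The environment was released as the open-source pysc2 library, so anyone can write an agent against the game's observations and action space. As a taste, here's a scripted agent adapted from the examples that ship with pysc2, which simply picks a random legal action each step (attribute names may differ slightly between releases):

```python
# Random agent for pysc2, adapted from the library's scripted examples.
import numpy
from pysc2.agents import base_agent
from pysc2.lib import actions

class RandomAgent(base_agent.BaseAgent):
    def step(self, obs):
        super().step(obs)
        # choose among the actions the game currently allows
        function_id = numpy.random.choice(obs.observation.available_actions)
        # fill in random arguments (screen coordinates, queue flags, etc.)
        args = [[numpy.random.randint(0, size) for size in arg.sizes]
                for arg in self.action_spec.functions[function_id].args]
        return actions.FunctionCall(function_id, args)
```

Random play won't win any tournaments, of course; the research challenge is replacing that random choice with a learned policy.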

In the latest check-in with the StarCraft II AI project, Oriol Vinyals, a research scientist at DeepMind, stressed that the goal of the project was to make AI smarter in general, not just good at one specific game:

DeepMind is building what people call “AGI”—artificial general intelligence. You’re not specifically building an agent to play one game, but you want to understand what the learning paradigm is, so that this agent could play any game without much prior knowledge. I thought it would be very challenging, and quite fun to build a bot where, instead of writing the rules, we just provide the agent the screen. “Here is the mouse and a keyboard. Go ahead, start interacting with the game, try to get better at it.”

It remains to be seen whether an AI capable of beating the best of the best players will emerge from this, but I want to be watching what could be the next great milestone a la Kasparov vs Deep Blue. Even more exciting is that, through these simulations of war, humans and AI are becoming a little closer to mutual understanding.

Do you think that video games actually teach good skills for the real world? Do you look forward to machines that can see and interact with the world like we do?

Nicholas Montegriffo
Editor

A cyberpunk and actual punk, Nicholas is the AndroidPIT team's hardcore gamer, writing with a focus on future tech, VR/AR, AI & robotics. Out of office, he can be found hanging around in goth clubs, eating too many chillies, or at home telling an unlucky nerd that their 8th level wizard died from a poisoned spike trap.
