Robot Soccer: AI Challenges the Next Frontier of Human-Robot Interaction

Category Machine Learning

tldr #

DeepMind's recent project of designing humanoid robots for soccer and teaching them the game with Deep Reinforcement Learning has stirred much fascination on social media, and offers the promise of AI agents acting with agility, dexterity and understanding. In a video clip of the project, a researcher is shown repeatedly pushing down a robot that is trying to score a goal, illustrating the resilience and 'abuse tolerance' the robots acquired through deep learning.

content #

Deep Blue vs. Kasparov. Watson vs. Ken Jennings and Brad Rutter. DeepMind vs. Atari. AlphaGo vs. Lee Sedol. The great machine vs. human competitions over the last few decades left no doubt about who's the boss.

All of those matches were played in a most courteous fashion. The challenges all involved intellectual pursuits.

But what will happen when AI takes on games of physical contact, when, for instance, robots engage in pushing, shoving or knocking opponents over?

AlphaGo, a machine created by DeepMind, beat its human opponent, Lee Sedol, 4-1 in 2016

Researchers at DeepMind addressed that issue during trials of humanoid robots trained to play soccer. No human subjects were involved in these contests, not yet anyway. But there was some rough play.

In a paper released last week on the arXiv preprint server, Tuomas Haarnoja and more than two dozen colleagues reported on their successful efforts to teach complex movement skills and basic game strategy to robots.

In the Loebner Prize, an annual competition founded to identify the most human-like conversational program, entrants had to pass a "Turing test" of sorts and convince judges that they were human

The researchers said that numerous projects by others in recent years involving quadrupedal robots have yielded impressive results. Notable among them was Boston Dynamics' robot dog Spot, which excelled at smoothly navigating unknown, unstructured and hostile environments.

Fewer projects have tackled bipedal movement. The researchers say two-legged mobility poses additional challenges concerning stability and safety. When it comes to sports, those challenges are even greater.

The AI software on Boston Dynamics' robot dog, Spot, is said to be able to track its environment, even if there is an obstacle in the way. It can also climb stairs, traverse unknown terrain, and handle small and large objects

"Soccer requires a diverse set of highly agile and dynamic movements, including running, turning, side stepping, kicking, passing, fall recovery, object interaction and many more," Haarnoja said.

"Players further need to be able to make predictions about the ball, teammates and opponents, and adapt their movements to the game context. Players also need to coordinate movements over long time scales to achieve tactical, coordinated play."

DeepMind created an AI model that can play 49 separate Atari arcade games. It achieved full proficiency in 15 of those games, and human-level performance in 25

The crew at DeepMind designed miniature humanoid robots with 20 controllable joints and used Deep RL (Deep Reinforcement Learning) to teach them basic soccer skills. They focused on context-adaptive movement skills such as "walking, running, turning, kicking and fall recovery."
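The core idea of reinforcement learning is that the agent learns by trial and error, updating its behavior from a reward signal rather than from labeled examples. The sketch below illustrates that loop with tabular Q-learning on a hypothetical toy task (an agent stepping along a 1-D pitch toward a ball); the environment, rewards and hyperparameters are illustrative only, and DeepMind's actual system uses deep neural networks and a simulated humanoid, none of which appears here.

```python
import random

# Toy stand-in for the soccer task: an agent on a 1-D pitch of
# positions 0..4 must reach the ball at position 4. Everything
# here is a hypothetical illustration, not DeepMind's setup.
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]  # step left, step right

def train(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]  # Q-table: state x action
    for _ in range(episodes):
        state = 0
        while state != GOAL:
            # epsilon-greedy: mostly exploit the best-known action,
            # occasionally explore a random one
            if rng.random() < epsilon:
                a = rng.randrange(2)
            else:
                a = 0 if q[state][0] > q[state][1] else 1
            nxt = min(max(state + ACTIONS[a], 0), N_STATES - 1)
            reward = 1.0 if nxt == GOAL else 0.0  # reward only at the ball
            # standard Q-learning update: nudge the estimate toward
            # (immediate reward + discounted value of the next state)
            q[state][a] += alpha * (reward + gamma * max(q[nxt]) - q[state][a])
            state = nxt
    return q

q = train()
# After training, the learned policy prefers stepping right (toward
# the ball) in every non-goal state.
assert all(q[s][1] > q[s][0] for s in range(GOAL))
```

Deep RL replaces the table with a neural network so the same reward-driven update can scale to high-dimensional states like joint angles and camera input, but the learning loop is structurally the same.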

The robots exhibited "robust and dynamic movement skills," said Haarnoja. The report, titled "Learning Agile Soccer Skills for a Bipedal Robot with Deep Reinforcement Learning," was also posted on a Google blog last week.

AI has been shown to beat humans at poker and chess

The robotic soccer project differed from many earlier similar projects in that it focused on employing the entire robotic body—not just hands or feet—to engage in strategic play.

"Creating general embodied intelligence, that is, creating agents that can act in the physical world with agility, dexterity and understanding—as animals or humans do—is one of the long-standing goals of AI researchers and roboticists alike," Haarnoja said.

The use of robots in soccer, with or without AI, has been proposed as a tool for improving field measurements as well as players' physical performance and on-ball skills

Publication of the DeepMind project stirred much discussion on social media, but one brief video clip drew particular attention. In the clip, a researcher is shown repeatedly pushing down a robot that is trying to score a goal. Each time, the robot gamely recovered and got back on its feet.

Although clearly done to test and improve the robot's ability to recover from stumbles and other errors, the "abuse" stirred Twitter users to respond.

"It's hard not to anthropomorphize. My brain says, STOP BEING MEAN! Lol," said John Weller.

"At what point do [the robots] learn to fight back? Asking for a friend," joked Paul Jorgensen.
