Hyper-capable AIs have been beating us at our own games for years. Whether it’s Go or Jeopardy, DOTA 2 or Nethack, artificial intelligences have routinely proven themselves superior competitors, helping advance not only the state of the gaming arts but also those of machine learning and computational science. On Wednesday, Sony announced its latest addition to the field, GT Sophy, an AI racer capable of taking on — and beating — some of the world’s best Gran Turismo players.
GT Sophy (the GT stands for “Gran Turismo”) is the result of a collaboration between Sony AI, Polyphony Digital (PDI) and Sony Interactive Entertainment (SIE), as well as more than half a decade of research and development.
“Gran Turismo Sophy is a significant development in AI whose purpose is not simply to be better than human players, but to offer players a stimulating opponent that can accelerate and elevate the players’ techniques and creativity to the next level,” Sony AI CEO, Hiroaki Kitano, said in a statement Wednesday. “In addition to making contributions to the gaming community, we believe this breakthrough presents new opportunities in areas such as autonomous racing, autonomous driving, high-speed robotics and control.”
Utilizing a novel deep reinforcement learning method, the research team taught its AI agent how to control a digital race car within the structure of the GT game, helping Sophy understand vehicle dynamics and capabilities, racing tactics like slipstreaming, passing and blocking overtakers, and basic track etiquette.
“To drive competitively GT Sophy had to learn to control the car at the physical limit, optimize for braking and acceleration points, as well as find the right lines that squeeze the last tenth of a second out of the track,” Michael Spranger, COO of Sony AI, said during the presentation. “But racing also means that you’re not alone on the track, so Sophy has to find lines to pass opponents, taking into account the opponent’s reaction, as well as complex aerodynamic interactions between cars.”
Sony trained its AI using deep reinforcement learning to optimize its ability to stay on track. “Sophy observes the environment, such as the car speed and acceleration, the relative position of course borders and opponents, as well as the progress of the car along the track,” Spranger explained. “Based on these inputs, GT Sophy learns to take actions, such as using the throttle, steering or braking.”
“To learn,” he continued, “Sophy gets a positive signal — a reward — when things are going well, when it is making progress on the track and overtaking other cars. [Sophy receives] a negative signal when things are not going well, through continuous interaction with the game.”
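The observe–act–reward loop Spranger describes is the core of reinforcement learning. As a rough illustration only (Sony has not published GT Sophy's code, and its actual system uses deep neural networks trained against the real Gran Turismo physics engine), here is a minimal tabular Q-learning sketch over a hypothetical toy "track" environment, where the agent is rewarded for making progress and penalized for steering off course:

```python
import random

class ToyTrack:
    """Hypothetical stand-in environment: a one-dimensional track.
    Action 0 = throttle (advance), 1 = brake (hold), 2 = steer off (penalty)."""

    def __init__(self, length=10):
        self.length = length
        self.position = 0

    def reset(self):
        self.position = 0
        return self.position

    def step(self, action):
        if action == 0:
            self.position += 1
            reward = 1.0   # positive signal: making progress on the track
        elif action == 1:
            reward = 0.0   # neutral: holding position
        else:
            reward = -1.0  # negative signal: off the track
        done = self.position >= self.length
        return self.position, reward, done


def train(episodes=200, epsilon=0.1, alpha=0.5, gamma=0.9, seed=0):
    """Tabular Q-learning: learn action values from reward signals alone."""
    random.seed(seed)
    q = {}  # maps (state, action) -> estimated value
    env = ToyTrack()
    for _ in range(episodes):
        state, done = env.reset(), False
        while not done:
            # Epsilon-greedy: mostly exploit the best known action,
            # occasionally explore a random one.
            if random.random() < epsilon:
                action = random.randrange(3)
            else:
                action = max(range(3), key=lambda a: q.get((state, a), 0.0))
            next_state, reward, done = env.step(action)
            # Update toward reward plus discounted best future value.
            best_next = max(q.get((next_state, a), 0.0) for a in range(3))
            target = reward + gamma * best_next * (not done)
            old = q.get((state, action), 0.0)
            q[(state, action)] = old + alpha * (target - old)
            state = next_state
    return q
```

After training, the learned policy at the starting state prefers the throttle action, because that is the only action that ever produced a positive reward. GT Sophy's real observation and action spaces (continuous speeds, steering angles, opponent positions) are far richer, but the feedback loop is the same.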
The initial results were impressive, with Sophy beating 95 percent of the humans pitted against her within the first two days of training. What’s more, the AI continued to shave time off her splits throughout the following week. In an exhibition race Wednesday against some of Japan’s top Gran Turismo drivers — with four Sophy variants going up against a quartet of humans — the AI took the checkered flag and two of the top four positions on the game’s Lago Maggiore circuit. The winning AI agent began the race in pole position and stayed there through all three laps, eventually beating the pack by more than five seconds. The AIs were not running on rails, however, as evidenced by one agent misjudging a passing attempt, understeering through a turn and running head-on into the wall and out of the competition.
“This is not just a technical breakthrough project,” Sony Group CEO, Kenichiro Yoshida, said during Wednesday’s press event. “It really is about bringing AI into the hands of the game developers who are going to build new experiences for the players.”
Players will soon be able to pit themselves against Sophy, and potentially have her on hand as either an in-game driving coach or in-race teammate. Gran Turismo 7 for the PS4 and PS5 will be released on March 4th, and Sony executives expect the AI to be added in a future update.