Sony AI, a research and development division of the multimedia giant, has had its first AI breakthrough, and it makes the rest of us look like Sunday drivers. Called Gran Turismo Sophy, the AI was trained on Gran Turismo 7's hyper-realistic simulated racetracks to become one of the best digital race car drivers around.
Revealed in a blog post from Sony AI, Gran Turismo Sophy is a major departure from previous game-playing AI. Sony cites examples of programs that can play games like Chess or Go; Sophy, by contrast, plays a hyper-realistic physics-based sim, where conditions aren't fixed. In Chess, for instance, the board doesn't suddenly become slippery, causing pieces to slide out of place.
Sophy instead learned how to race through reinforcement learning, according to the AI's website. In this machine learning process, the AI is given a reward or penalty for the actions it takes, gradually teaching it new techniques. Over time, Sophy went from being a sloppy driver to one that can control a car in Gran Turismo 7 and push it to its limits. It can, for instance, take corners at the last possible moment and use sharp corners to overtake other drivers.
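To make the reward-and-penalty idea concrete, here is a minimal sketch in Python of a far simpler learner choosing a braking point by trial and error. The scenario, numbers, and bandit-style update are illustrative assumptions only; Sophy's actual system is deep reinforcement learning at a vastly larger scale.

```python
# Toy sketch of reinforcement learning's reward/penalty loop.
# NOT Sony's actual system: the braking-point scenario, limit, and
# update rule below are hypothetical, chosen only to illustrate
# how rewards and penalties shape behaviour over many attempts.
import random

BRAKING_POINTS = [60, 70, 80, 90, 100]  # how deep into the braking zone to wait (hypothetical units)
LIMIT = 75                              # waiting any deeper than this spins the car

def reward(braking_point: int) -> float:
    """Braking later is faster (higher reward) unless the car spins (penalty)."""
    return braking_point / 100 if braking_point <= LIMIT else -1.0

# Track the average reward observed for each choice.
values = {bp: 0.0 for bp in BRAKING_POINTS}
counts = {bp: 0 for bp in BRAKING_POINTS}

for episode in range(1000):
    # Epsilon-greedy: mostly exploit the best-known choice, occasionally explore.
    if random.random() < 0.1:
        action = random.choice(BRAKING_POINTS)
    else:
        action = max(values, key=values.get)

    r = reward(action)
    counts[action] += 1
    values[action] += (r - values[action]) / counts[action]  # running average

print("Learned values:", values)
print("Best braking point:", max(values, key=values.get))
```

After enough attempts, the learner settles on the latest braking point that doesn't trigger the penalty, which is the same basic feedback loop, scaled up enormously, that let Sophy refine its cornering.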
For anyone who plays Gran Turismo 7, Sophy will be a challenge to beat, as it uses techniques that surprise even professionals. Sony pitted the AI against FIA Gran Turismo Championships 2020 finalist Emily Jones, who was summarily left in the dust.
But according to a report from Wired on Sophy, the AI could eventually be used for more than challenging the best racing sim players: it has the potential to help drive forward the development of self-driving vehicles.
Gran Turismo 7 is set to launch on March 4 for PS4 and PS5.