Sony’s Gran Turismo AI Demolishes the World’s Best Human Drivers

    A Sony AI system called GT Sophy races in the Gran Turismo Sport video game. (Sony AI/Screenshot by Stephen Shankland/CNET)
    Over the last two years, Sony AI trained a computer system to play Polyphony Digital’s Gran Turismo Sport, a popular and realistic car racing game, and beat some of the world’s best human competitors. And that technology could be coming to a racing game near you.
    In a July competition, the AI, named GT Sophy, could defeat top humans only in time trials, with no other cars on the track, Sony said Wednesday. But by October, GT Sophy beat the humans even amid a scrum of virtual race cars.
    “Our goal is to develop AIs that make games more fun for all levels of players,” Sony AI Director Peter Wurman told CNET. “We are looking forward to working with Polyphony to figure out the best ways to incorporate this technology into Gran Turismo in the future.”
    GT Sophy is the latest experiment demonstrating that AI can be victorious at games such as chess and Go, which were long thought to be the domain of human intelligence. AI has also beaten people at classic Atari video games and the StarCraft real-time strategy game.
    AI today generally refers to computers programmed with a technology called neural networks, which loosely mimic the way human brains work. Sony’s achievement is notable enough to warrant a research paper in the prestigious journal Nature.
    A car racing video game like Gran Turismo presents open-ended tactical choices as well as simulated rules of physics. GT Sophy picked new ways to approach them, one of the human competitors said.
    “The AI drives in a way that we would never have come up with,” said Takuma Miyazono, who won three challenges in the FIA Gran Turismo 2020 World Finals, speaking in a video. He said GT Sophy’s tactics made sense when he saw it drive.
    
    Many AI systems are trained with real-world data through a technique called deep learning, which gives them the ability to recognize faces and spot spam. GT Sophy used a different approach, called reinforcement learning, that starts with an entirely untrained system that has no idea what to do. The AI raced courses over and over again, guided by a human-designed reward system that encouraged better results, and eventually mastered the game.
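    Sony hasn’t released GT Sophy’s code, but the core idea of reinforcement learning can be shown with a toy example. The Python sketch below teaches an agent how fast to drive through a made-up five-segment track using only trial, error and a reward; the track, actions and reward values are all invented for illustration, and the update is stripped down to a single-step, bandit-style version of the technique.

```python
import random

# Toy illustration of reinforcement learning: an agent learns, by trial
# and error, how fast to drive through a sequence of track segments.
# Nothing here comes from GT Sophy; the "track" and reward are invented.

TRACK = ["straight", "straight", "corner", "straight", "corner"]
ACTIONS = ["slow", "fast"]

# Reward: going fast on a straight is good; going fast into a corner crashes.
def reward(segment, action):
    if segment == "corner" and action == "fast":
        return -10.0  # crash penalty
    return 2.0 if action == "fast" else 1.0

# Value table maps (segment index, action) -> estimated value of that choice.
q = {(i, a): 0.0 for i in range(len(TRACK)) for a in ACTIONS}

alpha, epsilon = 0.1, 0.2  # learning rate, exploration rate
for episode in range(5000):
    for i, segment in enumerate(TRACK):
        # Explore randomly sometimes; otherwise pick the best-known action.
        if random.random() < epsilon:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: q[(i, a)])
        # Nudge the estimate toward the reward actually received.
        q[(i, action)] += alpha * (reward(segment, action) - q[(i, action)])

# The learned policy slows for corners and speeds on straights.
policy = [max(ACTIONS, key=lambda a: q[(i, a)]) for i in range(len(TRACK))]
print(list(zip(TRACK, policy)))
```

    Run it and the agent reliably learns to slow for the corners — behavior that nobody programmed in explicitly, only rewarded.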
    Training an AI to play Gran Turismo was more challenging than training one for earlier video games, both because a small mistake can be catastrophic and because it’s tough to handle the unwritten rules of car racing, such as avoiding collisions and not inappropriately cutting off other drivers.
    “Unlike these other games, where the rules are enforced by the game environment, a big part of the rules of racing are subjective and, at the highest levels, enforced by human judges,” Wurman said. “This subtle objective to be a good sport and still be bold enough to try to win the race is unlike anything that exists in other games.”
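    Sony hasn’t published the exact reward terms here, but in reinforcement learning such etiquette is typically folded into the reward signal as numeric penalties. A hypothetical sketch, with terms and weights invented for illustration rather than taken from GT Sophy:

```python
# Hypothetical reward shaping: racing etiquette becomes numeric penalties.
# The terms and weights below are illustrative, not GT Sophy's actual reward.

def shaped_reward(progress_m, hit_wall, hit_car, at_fault, off_course):
    r = progress_m                 # base reward: distance gained along the track
    if hit_wall:
        r -= 5.0                   # discourage scraping barriers
    if hit_car:
        r -= 10.0                  # discourage contact with other cars...
        if at_fault:
            r -= 20.0              # ...especially contact the agent caused
    if off_course:
        r -= 5.0                   # discourage cutting corners off the track
    return r

# Example: gained 12 meters but caused a collision -> net negative reward.
print(shaped_reward(12.0, hit_wall=False, hit_car=True, at_fault=True, off_course=False))
```

    Tuning weights like these is how designers balance being a good sport against being bold enough to win.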
    Sony AI ran simulations on personal computers connected to a bank of more than 1,000 PlayStation 4 game consoles. It trained different agents for specific cars and tracks, with each agent consuming the processing power of 20 PlayStations, 20 PCs and one high-end graphics chip, Wurman said.
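    The shape of that setup — many consoles generating driving experience in parallel for a central learner to consume — is a common pattern in large-scale reinforcement learning. A minimal Python sketch of the pattern, with all names invented and simple strings standing in for game data:

```python
import multiprocessing as mp

# Hypothetical shape of a distributed training setup: parallel workers
# (standing in for game consoles) generate experience, and a central
# learner consumes it. Names and structure are illustrative, not Sony's.

def rollout_worker(worker_id, queue):
    """Stand-in for one console running the game and reporting results."""
    for episode in range(3):
        # A real system would stream (state, action, reward) data from the
        # game; here a placeholder record marks each finished episode.
        queue.put((worker_id, episode, "experience-batch"))

if __name__ == "__main__":
    queue = mp.Queue()
    workers = [mp.Process(target=rollout_worker, args=(i, queue)) for i in range(4)]
    for w in workers:
        w.start()
    # The "learner" drains experience; a real one would update the network here.
    for _ in range(4 * 3):
        print("learner received:", queue.get())
    for w in workers:
        w.join()
```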
    Human players can race in the game with ordinary controllers, though serious competitors prefer advanced setups with steering wheels. GT Sophy instead used an all-electronic programming interface that Polyphony Digital exposed for the project.
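    Polyphony hasn’t publicly documented that interface, but an agent’s control loop against such an interface would plausibly read the car’s state each tick and send back steering, throttle and brake commands. A hypothetical sketch, with the GameClient class and its methods invented for illustration:

```python
# Hypothetical agent control loop against a game-exposed interface.
# The GameClient class and its methods are invented for illustration;
# Polyphony's actual interface for this project isn't public.

class GameClient:
    """Stand-in for the game's programmatic interface."""
    def observe(self):
        # Would return car state: speed, position, nearby cars, etc.
        return {"speed_kph": 180.0, "distance_to_corner_m": 120.0}

    def act(self, steering, throttle, brake):
        print(f"steer={steering:+.2f} throttle={throttle:.2f} brake={brake:.2f}")

def policy(obs):
    # Placeholder for the trained neural network: brake before corners.
    if obs["distance_to_corner_m"] < 150.0:
        return 0.1, 0.0, 0.8
    return 0.0, 1.0, 0.0

client = GameClient()
for step in range(3):  # the real loop would run at the game's control rate
    obs = client.observe()
    steering, throttle, brake = policy(obs)
    client.act(steering, throttle, brake)
```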
    The next in the venerable series, Gran Turismo 7, debuts on March 4.