
Gran Turismo Built an AI That’s Better Than Any Player

Nicknamed ‘Sophy’, the superhuman racing AI was a joint project between Sony and Gran Turismo‘s developer, Polyphony Digital. The result was a racer that became virtually unbeatable thanks to deep reinforcement learning. Whatever that means!

Sophy has been in the pipeline for nearly two years now, with the game’s developers constantly testing the AI against human drivers in various online racing leagues. While Sophy very quickly became faster and more accurate than its human counterparts, the same could not be said of its sportsmanship. The AI blitzed through the game with little concern for its human competitors, often driving like a (insert expletive of choice here).

That said, Sophy isn’t just a robot that learned to drive in a game. Thanks to Gran Turismo’s constant focus on realism, Sophy actually understands real-world driving physics! It learned how to race by being either positively or negatively reinforced based on its in-game behaviour, with training scenarios run over thousands of hours so the AI could perfect its craft, much like the proverbial 10,000 hours a human needs to master a new skill.
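For the curious, that “positively or negatively reinforced” idea can be sketched in a few lines of Python. This is a deliberately tiny, made-up example, not Sophy’s actual system (Sony’s agent uses deep reinforcement learning at far greater scale): a one-state learner weighs a cautious action against an aggressive one that is faster but sometimes triggers a contact penalty, mirroring the sportsmanship problem described above.

```python
import random

def train(episodes=2000, seed=0):
    """Toy reinforcement sketch: one state, two driving styles.

    Action 0 = cautious (steady small reward).
    Action 1 = aggressive (bigger reward, but a 50% chance of a
    collision penalty, standing in for bad sportsmanship).
    All numbers here are illustrative, not from Sophy.
    """
    rng = random.Random(seed)
    q = [0.0, 0.0]          # estimated value of each action
    alpha, eps = 0.1, 0.1   # learning rate, exploration rate

    for _ in range(episodes):
        # Mostly pick the best-known action, occasionally explore.
        if rng.random() < eps:
            a = rng.randrange(2)
        else:
            a = 0 if q[0] >= q[1] else 1

        if a == 0:
            r = 1.0         # cautious lap: modest positive reinforcement
        else:
            # aggressive lap: faster, but penalised when it "collides"
            r = 2.0 if rng.random() < 0.5 else -5.0

        # Nudge the value estimate toward the observed reward.
        q[a] += alpha * (r - q[a])

    return q

values = train()
```

After training, the cautious action ends up valued higher than the aggressive one, because the occasional penalty outweighs the speed gain; the same principle, at vastly greater scale, is how penalties teach an agent cleaner racing.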

While I’m very much looking forward to the release of Gran Turismo 7, I don’t fancy having my rear handed to me by an unbeatable AI. Anyway, here’s a final gameplay trailer of GT7 to keep you virtual racers on your toes!

Thanks for reading! For more news and reviews, check out Tarmac Life.

Words by Matthew D’Souza, pictures courtesy of Gran Turismo Media.
