AI Can Now Also Outrace Human Champs in the Videogame 'Gran Turismo'
Scientific American reports: To hurtle around a corner along the fastest "racing line" without losing control, race car drivers must brake, steer and accelerate in precisely timed sequences. The process pushes against the limits of friction, which are governed by known physical laws, so self-driving cars can learn to complete a lap at the fastest possible speed (as some have already done). But this becomes a much knottier problem when the automated driver has to share space with other cars. Now scientists have tackled the challenge virtually by training an artificial intelligence program to outpace human competitors at the ultrarealistic racing game Gran Turismo Sport. The findings could point self-driving car researchers toward new ways to make this technology function in the real world.

Artificial intelligence has already conquered human players in certain video games, such as StarCraft II and Dota 2. But Gran Turismo differs from other games in significant ways, says Peter Wurman, director of Sony AI America and co-author of the new study, which was published this week in Nature. "In most games, the environment defines the rules and protects the users from each other," he explains. "But in racing, the cars are very close to each other, and there's a very refined sense of etiquette that has to be learned and deployed by the [AI] agents. In order to win, they have to be respectful of their opponents, but they also have to preserve their own driving lines and make sure that they don't just give way."

To teach their program the ropes, the Sony AI researchers used a technique called deep reinforcement learning. They rewarded the AI for certain behaviors, such as staying on the track, remaining in control of the vehicle and respecting racing etiquette. Then they set the program loose to try different ways of racing that would enable it to achieve those goals. The Sony AI team trained multiple versions of its AI, dubbed Gran Turismo Sophy (GT Sophy), each specialized in driving one particular type of car on one particular track. Then the researchers pitted the program against human Gran Turismo champions. In the first test, conducted in July 2021, humans achieved the highest overall team score. On the second run, in October 2021, the AI broke through. It beat its human foes both individually and as a team, achieving the fastest lap times....

"The lines the AI was using were so tricky, I could probably do them once. But it was so, so difficult — I would never attempt it in a race," says Emily Jones, who was a world finalist at the FIA-Certified Gran Turismo Championships 2020 and later raced against GT Sophy.... "Racing, like a lot of sports, is all about getting as close to the perfect lap as possible, but you can never actually get there," Jones says. "With Sophy, it was crazy to see something that was the perfect lap. There was no way to go any faster."

The article notes that Sony AI is now working with Gran Turismo's developer (the Sony Interactive Entertainment subsidiary Polyphony Digital) to potentially incorporate a version of their AI into a future update of the game. "To do this, the researchers would need to tweak the AI's performance so it can be a challenging opponent but not invincible..."
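The article only names the training technique (deep reinforcement learning, with rewards for making progress while staying on track, keeping the car under control and racing cleanly), not the actual reward design. The snippet below is a minimal Python sketch of that general reward-shaping idea. Everything in it (the RaceState fields, the compute_reward weights, the penalty values) is a hypothetical illustration, not GT Sophy's implementation.

```python
# Hypothetical sketch of the reward-shaping idea described above.
# All names and weights are invented for illustration; GT Sophy's real
# reward terms and training stack are not reproduced in the article.

from dataclasses import dataclass


@dataclass
class RaceState:
    course_progress: float   # metres gained along the track this step
    off_track: bool          # car left the track limits
    lost_control: bool       # spin or wall contact
    caused_collision: bool   # at-fault contact with another car (etiquette)


def compute_reward(state: RaceState) -> float:
    """Combine progress with penalties, mirroring the behaviors the article
    says were rewarded: staying on track, keeping control, racing cleanly."""
    reward = state.course_progress          # faster laps -> more progress per step
    if state.off_track:
        reward -= 5.0                       # discourage cutting or running wide
    if state.lost_control:
        reward -= 10.0                      # spins waste time
    if state.caused_collision:
        reward -= 20.0                      # "etiquette" term: punish at-fault contact
    return reward


# A deep-RL agent would then be trained to maximize the discounted sum of
# these rewards over many simulated races, e.g. with an actor-critic method.
if __name__ == "__main__":
    clean_step = RaceState(course_progress=3.2, off_track=False,
                           lost_control=False, caused_collision=False)
    dirty_step = RaceState(course_progress=3.5, off_track=True,
                           lost_control=False, caused_collision=True)
    print(compute_reward(clean_step))   # 3.2
    print(compute_reward(dirty_step))   # 3.5 - 5.0 - 20.0 = -21.5
```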
from Slashdot https://ift.tt/XMjAnLg
Read more of this story at Slashdot.