Here’s the excerpt from Edge:
But latency is the thing that gets the most attention. And while it’s already proven to be more than playable, [Madj Bakar, VP of Engineering] expects further improvements. “Ultimately, we think, in a year or two, we’ll have games that are running faster and feel more responsive in the cloud than they do locally, regardless of how powerful the local machine is,” he claims. These improvements will come via a term which sounds rather slippery. “Negative latency” is a concept by which Stadia can set up a game with a buffer of predicted latency between the server and player, and then use various methods to undercut it. It can run the game at a super-fast framerate so it can act on player inputs earlier, or it can predict a player’s button presses. These tricks can help the game feel more responsive, potentially more so than a console game running locally at 30fps with a wireless controller.
Now, the “negative latency” moniker definitely sounds far-fetched. But there could be something to the idea of combining high frame rates with a buffer. And with Google’s experience in AI and machine learning, it’s not out of the question that Stadia could predict button presses.
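To make the buffer-plus-prediction idea a bit more concrete, here’s a minimal sketch of how a speculative-input loop could work in principle. To be clear, everything in it is an illustrative assumption: the class, the one-button “game,” and the naive repeat-the-last-input predictor are ours, not anything Google has described about Stadia.

```typescript
// Illustrative sketch of speculative input handling: the server simulates a
// frame before the player's real input has crossed the network, then rolls
// back and corrects itself if the guess was wrong. Toy example only.

type Input = { frame: number; buttons: number };

class SpeculativeServer {
  private history: Input[] = [];                      // real inputs received so far
  private predicted = new Map<number, Input>();       // inputs we guessed per frame
  private snapshots = new Map<number, number>();      // saved state per frame, for rollback
  state = 0;                                          // toy game state (player x position)

  // Very naive predictor: assume the player repeats whatever they did last.
  private predict(frame: number): Input {
    const last = this.history[this.history.length - 1];
    return { frame, buttons: last ? last.buttons : 0 };
  }

  // Advance one frame *before* the real input for it has arrived.
  simulateAhead(frame: number): void {
    this.snapshots.set(frame, this.state);
    const guess = this.predict(frame);
    this.predicted.set(frame, guess);
    this.state = this.apply(this.state, guess);
  }

  // The real input arrives, possibly tens of milliseconds later.
  receiveInput(real: Input): void {
    this.history.push(real);
    const guess = this.predicted.get(real.frame);
    if (guess && guess.buttons !== real.buttons) {
      // Misprediction: roll back to the snapshot and re-run the frame correctly.
      this.state = this.snapshots.get(real.frame)!;
      this.state = this.apply(this.state, real);
    }
    this.predicted.delete(real.frame);
    this.snapshots.delete(real.frame);
  }

  // Toy game rule: button bit 0 moves the player one unit to the right.
  private apply(state: number, input: Input): number {
    return state + (input.buttons & 1);
  }
}

// Usage: each frame is simulated before its input arrives, then reconciled.
const server = new SpeculativeServer();
server.simulateAhead(0);                       // no history yet, so guess "no buttons"
server.receiveInput({ frame: 0, buttons: 1 }); // player was actually holding "right": rollback
server.simulateAhead(1);                       // now guess they're still holding "right"
server.receiveInput({ frame: 1, buttons: 1 }); // guess was correct, no correction needed
console.log(server.state);                     // 2: two frames of movement despite late inputs
```

The appeal of this kind of scheme is that a correct guess effectively hides the network round trip, which is presumably where the “negative” in “negative latency” comes from, while a wrong guess only costs a correction. A real implementation would also have to re-simulate any later frames built on top of a mispredicted one and mask the correction from the player, which is far harder than this toy suggests.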
Bakar is making a very specific comparison: cloud-based gaming versus a console running at 30 frames per second with a wireless controller. So it’s obviously a stretch to treat that as an apples-to-apples case in the real world. More generally, gamers will keep raising their eyebrows at the concept of game streaming until we’re able to test it in the wild. Luckily, that’s only a month away, as Google Stadia will launch sometime in November.
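To put that baseline in perspective, here’s some back-of-the-envelope frame-time arithmetic. The numbers are illustrative assumptions, not measurements from Stadia or any particular console.

```typescript
// Rough frame-time budget comparison, using illustrative numbers only.
// A 30fps game samples input roughly every 33ms; a server ticking at 120fps
// samples every ~8ms. The ~25ms difference is budget a cloud service could,
// in principle, spend on the network round trip. Wireless controllers are
// often said to add a few more milliseconds, but that's an assumption here,
// not a measurement.

const frameTimeMs = (fps: number): number => 1000 / fps;

const consoleSampleMs = frameTimeMs(30);    // ≈ 33.3ms between input samples
const cloudSampleMs = frameTimeMs(120);     // ≈ 8.3ms between input samples
const reclaimedBudgetMs = consoleSampleMs - cloudSampleMs; // ≈ 25ms

console.log(`Budget reclaimed by faster ticking: ~${reclaimedBudgetMs.toFixed(1)}ms`);
```

Whether that reclaimed budget actually covers a real-world round trip to a Google data centre is exactly the question we won’t be able to answer until we can play it ourselves.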