

- Video games involving warped reality update#
- Video games involving warped reality software#
- Video games involving warped reality Pc#

Video games involving warped reality software#
"The headset manufacturers have a whole software architecture that is effectively defined in the runtime. This choice reduces the chance of wearer sickness by keeping head tracking fluid, but you do lose out on game information.
Video games involving warped reality update#
This is where we've got options, just like there are options for more traditional frame output and pacing: Redisplay the old frame, or attempt to update the old frame? In VR, the latter might mean an update to head tracking and look, but could sacrifice animation updates. The runtime is still doing its stuff, but it has to make a decision: What does it want to show in the next frame?" Now you've already missed the next interval. Sometimes the game runs too long or the runtime runs too long. The problem is, let's say your GPU is running slower or you have the CPU get busy. It has to get that all done in time for the next refresh. And the next thing that happens is the runtime, that's going to read that texture, and the runtime is going to do things like warping for the lens, or lens warp, and it's also going to do reprojection, which is sort of retiming. The way it works is the game might start rendering, and it's looking at things like taking a headset position and calculating animation, and it might be doing some network stuff, but it's figuring out what's the image that I'm about to put up on the screen. You've got this window, just a small time before that cycle, that you have to be done with that image. "Since it's 90Hz, there's an 11ms step and it happens over and over. Not much time to get the frame together and complete the image. The window during which the runtime executes is only a few milliseconds - it varies based on vendor - so that final crunch frame is something like 3-5ms.

"Lenses are there to allow your eyes to relax and see the image, but those lenses cause distortion when it's on your, so they actually do undistortion modifying the square images to make them more curved to get ready for the lenses."

It does lens correction, which is making the image work with the lenses in the headset, and it's also doing something called 'late warp' or 'reprojection.' The reasons that's happening is to deal with the fact that there's a fixed 11ms window with a refresh of 90Hz that they're trying to hit all the time.
Video games involving warped reality Pc#
But because it's VR, there's lenses and head motion, so the runtime providers - folks like Oculus or Valve - are providing another program that I just call the 'runtime.' The runtime runs in parallel with the game, so the game is rendering a texture that's square, effectively, just like a regular PC game, but then that gets read by the runtime. There's the game, which is the application you're running like Raw Data or something like that, and the game's job is to take physical input, do the simulation, and then to generate a frame. "VR is a little bit more complicated because there's more going on. Of this new pipeline, Petersen explained: Vomiting or loss of balance are a potential side effect from severe enough stuttering. VR is a little different in its pipeline, but more or less offers the same types of negative effects from a missed interval - they're just more drastic, because human biology comes into play. It's a matter of personal preference, but we do generally find that folks are more willing to endure tears than stutters.Īdaptive synchronization technologies have largely resolved both of these issues, by slaving the display's refresh rate to the GPU's frame output, and that's with G-Sync, Freesync, and Fast Sync. Here's an example:įor a lot of gamers, we've gotten used to this. Alternatively, without v-sync, we end up with tearing - the frame output is chaotic enough that frames get "painted" atop one another, potentially causing vertical "tears" in output. Unfortunately, missing one of those intervals means that, with V-Sync, the previous frame is re-displayed and the user experiences what is colloquially known as a "stutter." We're effectively missing information in this scenario, and must wait until the next refresh interval to find out what happens next. Ideally, with a 60Hz display, you're generating those frames about every 16ms (or 8ms on 120Hz) to reduce chance of tearing ( V-Sync eliminates this). In a traditional pipeline, a frame gets rendered and must be sent to the display for presentation to the user.
