NVIDIA hasn't exactly been a household name in VR of late; AMD has mostly been stealing the show with the introduction of LiquidVR, one of the focal points of Fiji. NVIDIA has been hard at work on its own VR optimizations, however, to help enable visceral VR experiences.
At Gamescom, NVIDIA has an area entirely devoted to VR. Oculus is, of course, showing EVE: Valkyrie running on NVIDIA hardware, and HTC is demoing the Portal VR demo to willing participants, with enough room to actually walk around in-scene. Thus far the reception has been ecstatic, as these two examples show off the best of what the idea has to offer.
EVE: Valkyrie makes use of 'eye tracking' of a sort: you look, with the entire head-mounted display, at the enemy ships you want to lock your missiles onto. The fast-paced three-dimensional action is well suited to looking around and exploring all the pretty (Unreal Engine powered) explosions going on.
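To illustrate the mechanic, here is a minimal sketch of how such head-gaze target lock could work. This is not EVE: Valkyrie's actual code; every type, value, and the lock-cone threshold in it is an assumption. The idea is simply to pick the enemy whose direction deviates least from the headset's forward vector.

```cpp
// A hedged sketch of head-gaze target lock, not EVE: Valkyrie's code:
// pick the enemy whose direction makes the smallest angle with the
// head-mounted display's forward vector. All values are illustrative.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { double x, y, z; };

double dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
double len(const Vec3& v) { return std::sqrt(dot(v, v)); }

int main() {
    Vec3 headForward{0.0, 0.0, 1.0};        // HMD gaze direction (unit vector)
    std::vector<Vec3> ships = {             // directions from player to ships
        {0.3, 0.1, 0.9}, {-0.8, 0.0, 0.6}, {0.05, -0.02, 1.0}};

    const double lockConeDeg = 10.0;        // assumed lock-on cone half-angle
    int best = -1;
    double bestAngle = lockConeDeg;

    for (size_t i = 0; i < ships.size(); ++i) {
        // Angle between where the head points and where the ship is.
        double cosA = dot(headForward, ships[i]) / (len(headForward) * len(ships[i]));
        double angleDeg = std::acos(cosA) * 180.0 / 3.14159265358979;
        if (angleDeg < bestAngle) {         // inside the cone, closest to centre
            bestAngle = angleDeg;
            best = static_cast<int>(i);
        }
    }

    if (best >= 0)
        std::printf("Locking missiles onto ship %d (%.1f deg off-centre)\n",
                    best, bestAngle);
    else
        std::printf("No ship within the lock-on cone\n");
    return 0;
}
```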
The Portal VR demo, though very much in alpha, has a lot of interaction to offer within the 15-foot-square area you can walk around in. You use your hands to explore the wondrous things before you. Though ultimately limited, the potential is there for a far more complete experience.
GameWorks VR actually has quite a lot of novel features to offer developers and headset makers. Jason Paul was quick to mention support for what NVIDIA calls Multi-Res Shading, a way to render different parts of a scene at different resolutions. Due to the nature of VR hardware, the optics slightly distort the image, rounding it, so the rendered image is warped around the edges to compensate. The periphery therefore doesn't need to be rendered at full resolution to maintain the same perceived pixel density, which saves rendering power and delivers faster performance for a visually similar (and still quite striking) scene.
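As a rough illustration of why this saves work, the sketch below splits a per-eye render target into a full-resolution centre and reduced-resolution edges and counts the pixels shaded. The resolution, centre size, and edge scale are all illustrative assumptions, not NVIDIA's figures.

```cpp
// A minimal sketch of the idea behind multi-resolution shading, not the
// GameWorks VR API: a 3x3 split where the centre cell keeps full
// resolution and the outer cells are shaded at a reduced scale.
#include <cstdio>

int main() {
    const int width = 1512, height = 1680;  // assumed per-eye render target
    const double centerFrac = 0.6;          // assumed full-res centre fraction
    const double edgeScale  = 0.5;          // assumed shading scale at edges

    // Pixel cost of shading the whole eye buffer at full resolution.
    double fullPixels = static_cast<double>(width) * height;

    // Centre cell: full resolution over centerFrac of each axis.
    double centerPixels = (width * centerFrac) * (height * centerFrac);

    // The remaining (peripheral) area is shaded at edgeScale per dimension.
    double edgeArea   = fullPixels - centerPixels;
    double edgePixels = edgeArea * edgeScale * edgeScale;

    double shaded = centerPixels + edgePixels;
    std::printf("Shaded %.0f of %.0f pixels (%.1f%% of full cost)\n",
                shaded, fullPixels, 100.0 * shaded / fullPixels);
    return 0;
}
```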
GameWorks VR also has the ability to assign a single GPU to each eye when multiple GPUs are present, increasing the performance of stereo rendering. More than that, it enables explicit control of each GPU, so you can distribute the workload in whatever way you specify.
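A hedged sketch of the one-GPU-per-eye idea follows. The Device type and renderEye function are hypothetical stand-ins rather than GameWorks VR API calls; the point is only that the two eye views are independent jobs that can be dispatched to separate GPUs in parallel.

```cpp
// A sketch of per-eye GPU assignment. Device and renderEye are
// hypothetical, not GameWorks VR calls: the eye views are independent
// and can be rendered on separate GPUs concurrently.
#include <cstdio>
#include <functional>
#include <string>
#include <thread>

struct Device {                 // hypothetical handle for one physical GPU
    std::string name;
};

// Hypothetical per-eye render job; a real renderer would record and
// submit command buffers to this device's queue.
void renderEye(const Device& gpu, const char* eye) {
    std::printf("%s: rendering %s eye view\n", gpu.name.c_str(), eye);
}

int main() {
    Device gpu0{"GPU0"}, gpu1{"GPU1"};

    // Each eye's frame is submitted to its own GPU; the two proceed in
    // parallel instead of serialising left/right on a single device.
    std::thread left(renderEye, std::cref(gpu0), "left");
    std::thread right(renderEye, std::cref(gpu1), "right");
    left.join();
    right.join();

    // With explicit control, the split need not be per-eye: work can be
    // distributed across GPUs however the application specifies.
    return 0;
}
```

In practice the two eyes render nearly identical scenes, which is what makes the per-eye split such a natural default.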
To help with dizziness and to decrease perceived latency, a great feature called asynchronous timewarp is implemented. Just before display, it adjusts the last rendered image to match where the user's head is now, even if a new frame hasn't finished rendering yet.
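The sketch below shows the core correction under a deliberately crude assumption: a pure yaw change between the pose a frame was rendered at and the latest sampled pose is approximated as a horizontal image shift. Real timewarp reprojects with the full head rotation; the field of view and poses here are illustrative.

```cpp
// A minimal sketch of asynchronous timewarp's correction, reduced to a
// small-angle horizontal shift for a pure yaw change. All numbers are
// illustrative assumptions.
#include <cstdio>

int main() {
    const double fovDeg  = 100.0;  // assumed horizontal field of view
    const int    widthPx = 1512;   // assumed per-eye image width

    double renderedYawDeg = 10.0;  // head yaw when the frame was rendered
    double latestYawDeg   = 11.5;  // head yaw sampled just before scan-out

    // Approximate pixels per degree across the display, then shift the
    // finished frame by the yaw the head has turned since rendering began.
    double pxPerDeg = widthPx / fovDeg;
    double shiftPx  = (latestYawDeg - renderedYawDeg) * pxPerDeg;

    std::printf("Warp the last frame %.1f px to match the new head pose\n",
                shiftPx);
    return 0;
}
```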
And of course there is direct mode, where Windows doesn't get access to the head-mounted display as a traditional monitor, so it doesn't try to extend the desktop onto it (which would make for an interesting experience, as those with the Oculus DKs have no doubt concluded). The VR device is instead seen as a peripheral, albeit one controlled directly by the SDK. Direct mode also provides access to the front buffer, further reducing latency and hopefully making for a far smoother experience. While 120 FPS would be just darling, a more consistent experience will do more to ward off motion sickness and the dizziness that often occurs.
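Some back-of-the-envelope arithmetic shows why front-buffer access matters: every buffered frame sitting between rendering and scan-out adds roughly one refresh interval of delay. The refresh rate and queue depths below are assumptions for illustration, not measurements of any particular headset.

```cpp
// A back-of-the-envelope sketch of the latency saved by writing to the
// front buffer instead of waiting out a buffer swap. Illustrative
// numbers only.
#include <cstdio>

int main() {
    const double refreshHz = 90.0;               // assumed HMD refresh rate
    const double frameMs   = 1000.0 / refreshHz; // ~11.1 ms per refresh

    // Conventional double buffering: the finished image waits roughly one
    // full refresh in the back buffer before it is scanned out.
    double doubleBuffered = 2.0 * frameMs;

    // Front-buffer rendering: the image can be scanned out as soon as (or
    // even while) it is written, dropping about one queued frame of delay.
    double frontBuffer = 1.0 * frameMs;

    std::printf("double-buffered: ~%.1f ms; front buffer: ~%.1f ms; saved: ~%.1f ms\n",
                doubleBuffered, frontBuffer, doubleBuffered - frontBuffer);
    return 0;
}
```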