General: How are Sony hoping to get good performance from VR when we find Oculus tricky?

This might be a bit general but…

I have been battling to get good enough performance under VR with my DK2 Rift on my i7 @ 4.2 GHz with dual (SLI) Nvidia 980 GPUs. Clearly there is optimization that can be done, but how on earth are Sony hoping to get good VR performance from a console that can't even cope with full 1080p gaming most of the time (and certainly can't match the 4K output of my setup)?

I have read many threads here asking for ways to tweak UE4 performance under the Rift and to try to hold 75 FPS. Clearly on a closed system like the PS4 Sony can control some of these issues, but I don't see why my machine (and so many others) is struggling so much. Shadows seem to kill performance, but generally it just feels like VR requires too many compromises for anyone other than early adopters/devs at the moment?

Interested to see what others think Sony are planning, or where/when we can expect more performance to come from (other than Nvidia finally getting SLI working with VR!).

Sorry if it's too general!

Well, they do :)
London Heist (Project Morpheus SCE London) - Interrogation Footage - YouTube
First of all, Morpheus uses a 120 Hz panel and they are indeed able to run demos like this with a native framerate of 120 Hz:
Sony Morpheus demos at SVVR expo 2015 - YouTube
But the really smart part is that this allows them to render the game at 60 fps (instead of 75/90 fps) and then *reproject* it to 120 fps (like timewarp, but each rendered frame is reprojected and displayed twice). So they can both save performance and drive super-short persistence on their panel for reduced motion blur.
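
To make the trick concrete, here is a rough sketch of that 60-to-120 Hz reprojection loop (plain C++; every type and function here is a made-up stand-in, not Sony's actual API):

```cpp
// Minimal sketch of 60-to-120 Hz reprojection (all types/functions are
// hypothetical stand-ins, not Sony's actual API).
#include <cstdio>

struct Quat { float x, y, z, w; };           // head orientation
struct Frame { int id; Quat renderedPose; };

Quat SampleHeadPose()                        // stand-in for reading HMD tracking
{
    return Quat{0, 0, 0, 1};
}

Frame RenderScene(const Quat& pose, int id)  // the expensive part: runs at 60 Hz
{
    return Frame{id, pose};
}

void WarpAndPresent(const Frame& f, const Quat& /*latestPose*/)
{
    // Rotational reprojection: shift the already-rendered image by the delta
    // between the pose it was rendered with and the latest pose, then scan out.
    std::printf("present frame %d (reprojected)\n", f.id);
}

int main()
{
    Frame last{-1, Quat{0, 0, 0, 1}};
    for (int vsync = 0; vsync < 8; ++vsync)   // 8 vsyncs at 120 Hz ~ 66 ms
    {
        Quat pose = SampleHeadPose();
        if (vsync % 2 == 0)                   // only every other vsync renders
            last = RenderScene(pose, vsync / 2);
        WarpAndPresent(last, pose);           // but every vsync presents at 120 Hz
    }
}
```

The expensive scene render only happens on every other vsync, but every vsync still gets an image warped to the freshest head rotation, which is what keeps the repeated frame from feeling like judder.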

Then they don't have to fight huge driver overhead and they have full control over the hardware. They don't have to wait for Vulkan/DX12; they just have access to the bare metal if they want. AFAIK draw calls are handled much faster on the PS4 than under DX11.

Just look at games like Uncharted 4 or Horizon Zero Dawn:
UNCHARTED 4: A Thief’s End - E3 2015 - Sam Pursuit Gameplay | PS4 - YouTube

With their fixed platform they are able to optimize the hell out of their code, something which just isn’t possible on the PC. That’s why PS4/Morpheus will in fact deliver the same experience as a PC with a GTX 970, if not even better.

In addition, UE4 isn't really optimized for VR or stereo in general (yet): basically it draws everything twice and then flushes the GPU to prevent the driver from buffering frames, which alone costs about 30-40% performance according to Nick Whiting (Epic). That's why we're all eagerly awaiting their late latching implementation for UE 4.9, which should give us back a huge chunk of that lost performance on PC. :slight_smile:
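
For anyone wondering what "draws everything twice and flushes" means in practice, here's a very simplified sketch (plain C++ with made-up stand-in functions, not Epic's actual renderer code):

```cpp
// Very simplified stereo frame: every draw is submitted twice, once per eye,
// and the GPU is flushed at the end of the frame so the driver can't queue
// frames ahead (queued frames would add head-tracking latency in VR).
// All functions here are hypothetical stand-ins.
#include <cstdio>
#include <vector>

struct Mat4 { float m[16]; };     // per-eye view-projection matrix
struct DrawCall { int meshId; };  // one object to draw

void SubmitDraw(const DrawCall& dc, const Mat4& /*viewProj*/)
{
    std::printf("draw mesh %d\n", dc.meshId);   // stand-in for a real API call
}

void FlushGpu()
{
    std::printf("flush: wait for the GPU to finish this frame\n");
}

void RenderFrameNaiveStereo(const std::vector<DrawCall>& scene,
                            const Mat4& leftEye, const Mat4& rightEye)
{
    for (const DrawCall& dc : scene) SubmitDraw(dc, leftEye);   // eye 1
    for (const DrawCall& dc : scene) SubmitDraw(dc, rightEye);  // eye 2: same work again
    FlushGpu();   // keeps latency down, but CPU and GPU now take turns idling
}

int main()
{
    std::vector<DrawCall> scene{ DrawCall{0}, DrawCall{1}, DrawCall{2} };
    Mat4 left{}, right{};
    RenderFrameNaiveStereo(scene, left, right);
}
```

As I understand it, late latching attacks the flush part: instead of stalling the GPU so the head pose used for rendering stays fresh, the pose is updated in GPU memory as late as possible, so the driver can keep its normal pipelining without adding latency.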

Thanks for the thoughts @. Interesting to see the effect of the reprojection and the video links you posted. I guess we wait for UE 4.9 and see if we get our performance back - well, that and Nvidia finally getting SLI working!

Another point is that Sony is using a full RGB-stripe display, so Sony gets a big performance bonus over the Vive and Oculus Rift for free: the Vive and the Rift throw away 33% of the final subpixels because their PenTile layout only has two subpixels per pixel (green plus either red or blue) instead of three. Your eyes are more sensitive to green, so it isn't supposed to be a full 33% loss, but at current VR resolutions the subpixels are large enough to fall well within your red/blue acuity.
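
Just to show where the 33% figure comes from, a quick count (the per-eye resolution below is illustrative, not an official panel spec):

```cpp
// Quick count behind the "33%": an RGB-stripe panel has 3 subpixels per pixel,
// a PenTile RGBG panel only 2. The pixel count is an assumed example value.
#include <cstdio>

int main()
{
    const long pixels      = 1080L * 1200L;  // pixels per eye (assumed)
    const long rgbStripe   = pixels * 3;     // R, G and B subpixel per pixel
    const long pentileRGBG = pixels * 2;     // G plus either R or B per pixel
    std::printf("RGB stripe subpixels: %ld\n", rgbStripe);
    std::printf("PenTile subpixels   : %ld\n", pentileRGBG);
    std::printf("PenTile has %.0f%% fewer subpixels\n",
                100.0 * (rgbStripe - pentileRGBG) / rgbStripe);
}
```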

This is an excellent point, but 33% sounds like a lot? With the 1.3-1.4x resolution increase needed for Oculus/Vive, one would end up at ~1x again (with the -33% subpixel resolution in mind)?
Surely Oculus/Vive wouldn't choose PenTile screens just to save a few $, forcing users to have more capable machines?