Hi All,
I’ve been playing around recently with V-Ray’s new stereoscopic VR rendering features, and they’re pretty great! They’re really nice for backgrounds, heavy/complex scenes and mobile VR. I was wondering if anyone has ever tried to get them working in Unreal, so that each eye receives the correct imagery to display? So far there is a lot of discussion about getting Unreal to export 360 video/imagery (which would also be really great), but much less about getting Unreal to display it.
As someone who would like to push VR fidelity, iterate quickly, and who works in mobile, stereoscopic renders would be a great addition. Ideally, I’d experiment with using them for mid/backgrounds, and with single-perspective scenes on mobile devices.
For reference, here’s an example of a 6x1 VR cubemap, produced by Chaos Group. http://cadpoint.co.uk/img//ChaosGroup/newinVray.3.2/Steelblue_V-Ray_CubicVR_download.jpg
Each eye renders its own cube.
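From what I understand, the core maths of displaying one of these strips is mapping a view direction to a pixel in the 6x1 image. Below is a rough, plain-C++ sketch of that mapping just to show the idea – the face order (+X, -X, +Y, -Y, +Z, -Z) and the face orientations are my assumptions and would need checking against the actual V-Ray output, and in Unreal this logic would presumably end up in a shader or custom material node rather than C++. For stereo, you’d do the same lookup into the left- or right-eye strip depending on which eye is being rendered.

```cpp
#include <cmath>
#include <cstdio>

// UV coordinate into a 6x1 horizontal cubemap strip, both components in [0,1].
struct UV { float u, v; };

// Maps a 3D view direction to a UV in the strip.
// Face order (+X,-X,+Y,-Y,+Z,-Z) and OpenGL-style face orientation are assumptions;
// verify them against the real V-Ray cubemap layout.
UV DirectionToStripUV(float x, float y, float z)
{
    float ax = std::fabs(x), ay = std::fabs(y), az = std::fabs(z);
    int face;          // 0..5 index of the cube face within the strip
    float ma, sc, tc;  // major axis magnitude + the two in-face coordinates

    if (ax >= ay && ax >= az) {            // +/-X dominates
        ma = ax;
        face = (x > 0) ? 0 : 1;
        sc = (x > 0) ? -z : z;
        tc = -y;
    } else if (ay >= az) {                 // +/-Y dominates
        ma = ay;
        face = (y > 0) ? 2 : 3;
        sc = x;
        tc = (y > 0) ? z : -z;
    } else {                               // +/-Z dominates
        ma = az;
        face = (z > 0) ? 4 : 5;
        sc = (z > 0) ? x : -x;
        tc = -y;
    }

    // Per-face UV in [0,1], then squeeze into the face's 1/6 slice of the strip.
    float u = 0.5f * (sc / ma + 1.0f);
    float v = 0.5f * (tc / ma + 1.0f);
    return { (face + u) / 6.0f, v };
}

int main()
{
    UV uv = DirectionToStripUV(0.0f, 0.0f, 1.0f); // looking straight down +Z
    std::printf("u=%.3f v=%.3f\n", uv.u, uv.v);   // lands in face 4's slice of the strip
    return 0;
}
```

If I’m reading the Chaos Group example right, the two eyes are simply two of these strips, so the lookup is identical per eye – only the texture changes.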
I’d really appreciate anyone’s thoughts on bringing this into Unreal. My code-fu is basic; I’m an artist/architect by training, so I’m approaching this from a less technical background.
Cheers All,
Alistair.