Our plugin is a low-level integration into the UE rendering pipeline (our own implementation of the IStereoRendering interface), which gives superior performance in stereoscopic cluster and multi-wall environments.
At a glance:
multi-PC (cluster), CAVE™, multi-wall, and stereo support
scene object sync (camera, animations, particles, etc.)
VRPN input (keys, axes, positioning)
OpenGL quad-buffer support
VSync, G-SYNC, and NVIDIA Swap Sync support
asymmetric frustum configuration for stereoscopic systems (perfect in 3D!); a rough sketch of the idea follows below
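For readers wondering what "asymmetric frustum" means in practice: the frustum is skewed so the image plane stays glued to the physical screen while the eye moves off-center. A minimal sketch of the usual computation (not the plugin's actual code; all names and units here are illustrative):

```cpp
// Minimal sketch: asymmetric (off-axis) frustum bounds for one screen/wall.
// The tracked eye sits at (EyeX, EyeY) laterally and EyeDist in front of a
// screen of half-size (HalfW, HalfH); everything in the same units (e.g. cm).
struct FOffAxisBounds { float Left, Right, Bottom, Top; };

FOffAxisBounds ComputeOffAxisBounds(float EyeX, float EyeY, float EyeDist,
                                    float HalfW, float HalfH, float ZNear)
{
    // Project the screen edges, as seen from the eye, onto the near plane.
    const float Scale = ZNear / EyeDist;
    FOffAxisBounds B;
    B.Left   = (-HalfW - EyeX) * Scale;
    B.Right  = ( HalfW - EyeX) * Scale;
    B.Bottom = (-HalfH - EyeY) * Scale;
    B.Top    = ( HalfH - EyeY) * Scale;
    return B;
}
```

For stereo you evaluate this twice, with EyeX shifted by plus/minus half the interpupillary distance, which is exactly why each eye gets a differently skewed (asymmetric) frustum.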
Hi, I'm quite interested in the asymmetric frustum offsets. I had a v4.7 project which could do this: you set the camera position as if your nose was against the screen, then you could set X, Y, and Z offsets. I'm no C++ expert, so I couldn't get it working on later engine versions. Is this what your plugin can do? Is it dynamic, or just set on BeginPlay? What are your plans for this plugin?
This is interesting, though I feel my C++ knowledge is letting me down here. So GetStereoProjectionMatrix will return a ref to an FMatrix which you can then set dynamically, is that right? Previously this was something which wasn't possible without an edit to the engine source; is it now something which could be added with a plugin? (Please bear with me; I'm grasping a bit, and out of my depth.)
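For what it's worth: GetStereoProjectionMatrix returns an FMatrix by value and is queried every frame, so a custom IStereoRendering device can recompute the projection dynamically without engine edits, and GEngine->StereoRenderingDevice can be pointed at your own implementation from a plugin. A rough sketch building an off-axis projection from the bounds helper above; the exact virtual signature differs between UE4 versions and the interface's other pure virtuals are omitted, so treat this as version-dependent pseudocode rather than a drop-in plugin:

```cpp
// Sketch of a custom IStereoRendering device. Early UE4 versions pass a FOV
// parameter here and later ones don't, and pure virtuals such as
// IsStereoEnabled, AdjustViewRect, and CalculateStereoViewOffset are omitted
// for brevity; adapt to your engine version.
#include "IStereoRendering.h"

class FOffAxisStereoDevice : public IStereoRendering
{
public:
    virtual FMatrix GetStereoProjectionMatrix(EStereoscopicPass Pass, float /*FOV*/) const override
    {
        // Shift the eye laterally for the left/right pass, then reuse the
        // off-axis bounds helper from the earlier sketch.
        const float Shift = (Pass == eSSP_LEFT_EYE) ? -HalfIPD : HalfIPD;
        const FOffAxisBounds B =
            ComputeOffAxisBounds(EyeX + Shift, EyeY, EyeDist, HalfW, HalfH, ZNear);

        // Reversed-Z, infinite-far projection in the row-vector layout UE's
        // stereo devices use (compare the engine's SimpleHMD sample).
        const float SX = 2.0f * ZNear / (B.Right - B.Left);
        const float SY = 2.0f * ZNear / (B.Top - B.Bottom);
        const float OX = (B.Right + B.Left) / (B.Left - B.Right);
        const float OY = (B.Top + B.Bottom) / (B.Bottom - B.Top);
        return FMatrix(
            FPlane(SX,   0.0f, 0.0f,  0.0f),
            FPlane(0.0f, SY,   0.0f,  0.0f),
            FPlane(OX,   OY,   0.0f,  1.0f),
            FPlane(0.0f, 0.0f, ZNear, 0.0f));
    }

    // Tracked state, overwritten from game code every frame
    // (illustrative defaults, in centimetres).
    float EyeX = 0.0f, EyeY = 0.0f, EyeDist = 65.0f;
    float HalfIPD = 3.2f, HalfW = 150.0f, HalfH = 100.0f, ZNear = 10.0f;
};
```

A plugin can install it with something like GEngine->StereoRenderingDevice = MakeShareable(new FOffAxisStereoDevice()); since the matrix is rebuilt on every query, it is dynamic by construction.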
I was waiting until NVIDIA's new multi-projection engine build came out. It seems like a nifty move to do this on the graphics card, but it's probably not going to be as easy as just defining an offset, and it's possibly not going to be dynamic.
When you release your source I'd be really interested in taking a look. I've been working on a kind of virtual studio idea and had it working quite nicely using the tracking on an Oculus DK2, although that was obviously a very limited tracking area. I've now got a Vive with its Lighthouse system and want to use a handset transform to drive the frustum offset. I still have my 4.7 project with the adjustable matrix, but 4.7 doesn't have Vive support, so I can't try it out.
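A sketch of what driving the offset from a tracked handset could look like, assuming the custom stereo device above; AFrustumDriver, WallTransform, and StereoDevice are illustrative names, not plugin API:

```cpp
// Sketch: feeding a tracked Vive handset into the frustum offset each frame.
// MotionController is a UMotionControllerComponent attached to this actor;
// WallTransform describes the physical screen's pose in tracking space.
void AFrustumDriver::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Handset position in world (tracking) space.
    const FVector Tracked = MotionController->GetComponentLocation();

    // Re-express it relative to the physical wall: X = distance from the
    // screen plane, Y/Z = lateral offsets.
    const FVector Local = WallTransform.InverseTransformPosition(Tracked);
    StereoDevice->EyeX    = Local.Y;
    StereoDevice->EyeY    = Local.Z;
    StereoDevice->EyeDist = FMath::Max(1.0f, Local.X); // keep the frustum sane
}
```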
Now that is really impressive! I haven't seen anyone do a CAVE system like this with UE4. We have one down at the NCSU tech library near Epic, I think, so I should probably see if they're interested in trying this out.
How does a CAVE system like this compare to a simple VR HMD? My first instinct is that the VR HMD would be more immersive.
Hi. I subscribed to your newsletter, and I will be getting this for sure. Quick question: I am trying to blend two video projectors that are next to each other. I am assuming this can be done with a post-process material/blend; can you please enlighten me on how to go about it? Also, when is your plugin coming out? Thank you!
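One common way to edge-blend two overlapping projectors is a gamma-corrected falloff ramp across the overlap band, applied per-projector, e.g. in a post-process material's Custom node. A sketch of the ramp itself, with illustrative parameters; this is the textbook approach, not necessarily what the plugin ships:

```cpp
#include <cmath>

// Sketch: gamma-corrected edge-blend ramp for the projector overlap band.
// u runs 0..1 across the overlap (0 = inner edge, 1 = this projector's outer
// edge); Gamma of ~2.2 is illustrative. The same body drops into a material
// Custom node almost verbatim (HLSL pow instead of std::pow).
float EdgeBlendWeight(float u, float Gamma)
{
    // S-shaped falloff; this ramp plus the neighbouring projector's mirrored
    // ramp always sums to 1, so the overlap shows no seam.
    const float s = (u < 0.5f)
        ? 0.5f * std::pow(2.0f * u, 2.0f)
        : 1.0f - 0.5f * std::pow(2.0f * (1.0f - u), 2.0f);

    // Pre-compensate for projector gamma so the light output, not the pixel
    // value, crossfades linearly across the overlap.
    return std::pow(s, 1.0f / Gamma);
}
```

In the material you would multiply scene colour by this weight only inside the overlap band of each projector's output.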
I was able to get uvr1 looking at the right path of 1pc_8instances.cfg, and the viewport changed, but I was not able to open more than one viewport. Also, are there any blending capabilities within the system?