Unreal Engine for CAVE™, Multi-wall, Cluster Systems

Hi all,

We are on final approach to the first release of vrCluster.

http://vrcluster.io/

Our plugin is a low-level integration into the UE rendering pipeline (our own implementation of the IStereoRendering interface), which gives superior performance in stereoscopic cluster and multi-wall environments.

At a glance:

  • Multi-PC (cluster), CAVE™, multi-wall, and stereo support
  • Scene object sync (camera, animations, particles, etc.)
  • VRPN input (keys, axes, positioning)
  • OpenGL quad-buffer support
  • VSync, G-SYNC, and NVIDIA swap sync support
  • Asymmetric frustum configuration for stereoscopic systems (perfect in 3D!)
  • Blueprint access (VRPN data, level-load events, etc.)
  • Easy configuration
  • Cluster node listeners for remote launch
  • Detailed logging for troubleshooting
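To illustrate the asymmetric-frustum bullet: for a fixed screen, the off-axis frustum is usually derived from the viewer's eye position relative to the screen plane. A minimal standalone sketch (illustrative names and geometry, not the plugin's actual code):

```cpp
#include <cassert>
#include <cmath>

// Near-plane frustum bounds for an off-axis (asymmetric) projection.
// The screen is the rectangle x in [xMin, xMax], y in [yMin, yMax] in the
// plane z = 0; the eye sits at (ex, ey, ez) with ez > 0 in front of it.
struct FrustumBounds { double left, right, bottom, top; };

FrustumBounds OffAxisFrustum(double xMin, double xMax,
                             double yMin, double yMax,
                             double ex, double ey, double ez,
                             double zNear)
{
    // Similar triangles: scale screen-edge offsets seen from the eye
    // (at distance ez) down to the near plane (at distance zNear).
    const double s = zNear / ez;
    return { (xMin - ex) * s, (xMax - ex) * s,
             (yMin - ey) * s, (yMax - ey) * s };
}
```

With the eye centered the bounds come out symmetric; as the head moves sideways the frustum skews accordingly, which is what keeps the 3D perspective correct on a fixed wall.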

We performed tests using demos available from the UE Learning Center:

ScifiHall Demo:

Realistic Rendering Demo:

**Infiltrator Art Demo:**

All scenes show superb performance in a clustered stereoscopic setup of 8 nodes (PC + projector each): 5760×1920 walls plus a 1920×1920 floor.

thanks!

Wow! This is quite awesome!

Thanks! We will release more videos soon.

Hi, I’m quite interested in the asymmetric frustum offsets. I had a 4.7 project which could do this, where you set the camera position as if your nose was against the screen, and then you could set x, y, and z offsets. I’m no C++ expert, so I couldn’t get it working on later engine versions - is this what your plugin can do? Is it dynamic, or just set on begin play? What are your plans for this plugin?

In the video I disabled head tracking to make the picture clearer.
Frustums are fully dynamic and updated in real time via GetStereoProjectionMatrix.
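For context, GetStereoProjectionMatrix is part of UE's IStereoRendering interface, and because the head moves every frame, an off-axis projection matrix has to be rebuilt every frame from the tracked position. A standalone sketch of such a matrix in the OpenGL glFrustum convention (illustrative only; the plugin itself works with UE's FMatrix and conventions):

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Column-major 4x4 off-axis projection matrix (OpenGL glFrustum convention).
// l, r, b, t are the asymmetric frustum bounds at the near plane n;
// f is the far plane. Indexing is m[column * 4 + row].
using Mat4 = std::array<double, 16>;

Mat4 FrustumMatrix(double l, double r, double b, double t,
                   double n, double f)
{
    Mat4 m{}; // zero-initialized
    m[0]  = 2.0 * n / (r - l);
    m[5]  = 2.0 * n / (t - b);
    m[8]  = (r + l) / (r - l);   // horizontal off-axis shear
    m[9]  = (t + b) / (t - b);   // vertical off-axis shear
    m[10] = -(f + n) / (f - n);
    m[11] = -1.0;
    m[14] = -2.0 * f * n / (f - n);
    return m;
}
```

When the bounds are symmetric (l = -r, b = -t) the shear terms vanish and this degenerates to an ordinary on-axis projection; a tracked head produces nonzero shear.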

We are considering releasing the sources on GitHub with flexible licensing options.

This is interesting - I feel my C++ knowledge is letting me down here, though. So GetStereoProjectionMatrix will return a ref to an FMatrix which you can then set dynamically, is that right? Previously this wasn’t possible without an edit to the engine source - is it now something which could be added with a plugin? (Please bear with me - I’m grasping a bit, and out of my depth.)

I was waiting until NVIDIA’s new multi-projection engine build came out - it seems like a nifty move to do this on the graphics card, but it’s probably not going to be as easy as just defining an offset - and it’s possibly not going to be dynamic.

When you release your source I’d be really interested in taking a look - I’ve been working on a kind of virtual-studio idea and had it working quite nicely using the tracking on an Oculus DK2, although that was obviously a very limited tracking area. I’ve now got a Vive with its Lighthouse system and want to use a handset transform to drive the frustum offset. I still have my 4.7 project with the adjustable matrix, but 4.7 doesn’t have Vive support, so I can’t try it out.
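For what it’s worth, driving per-eye frustums from a tracked transform usually just means offsetting the tracked position by half the interpupillary distance along the head’s right vector, then building each eye’s off-axis frustum against the fixed screen. A tiny standalone sketch (illustrative, not tied to the plugin or any tracker API):

```cpp
#include <cassert>
#include <cmath>

// Simple 3-vector for the sketch.
struct Vec3 { double x, y, z; };

// Offsets a tracked head position along the head's (unit-length) right
// vector by half the interpupillary distance, giving one eye's position.
// Each eye then gets its own off-axis frustum against the fixed screen.
Vec3 EyePosition(const Vec3& head, const Vec3& rightUnit,
                 double ipdMeters, bool leftEye)
{
    const double half = 0.5 * ipdMeters * (leftEye ? -1.0 : 1.0);
    return { head.x + rightUnit.x * half,
             head.y + rightUnit.y * half,
             head.z + rightUnit.z * half };
}
```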

This is a link to the AnswerHub thread for that 4.7 build, if you’re interested.

Dan

Now that is really impressive! I haven’t seen anyone do a CAVE system like this with UE4. We have one down at the NCSU tech library near Epic I think, so I should probably see if they’re interested in trying this out :smiley:

Cool!
Please let me know how it’s going.

thanks!

That’s how far I’ve got so far:

:frowning:

We made a simple website.

Please subscribe for updates:

http://vrcluster.io/

The plugin has been updated to support Unreal Engine 4.14.3.

Please subscribe for updates:
http://vrcluster.io/

How does a CAVE system like this compare to a simple VR HMD? First thoughts and instincts make me think the VR HMD would be more immersive.

This is amazing! :smiley:

Hi. I subscribed to your newsletter; I will be getting this for sure. Quick question: I am trying to blend two video projectors that are next to each other… I assume this can be done with a post-process material/blend… can you please enlighten me on how to go about it? Also, when is your plugin coming out? Thank you!
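For background on the blending question (the thread doesn’t say whether the plugin supports it): projector edge blending typically fades the overlap band with a gamma-corrected ramp so the combined light from the two projectors stays roughly constant across the seam. A standalone sketch of such a weight function (illustrative; a post-process material would compute the same thing per pixel):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Blend weight for a pixel in the overlap band between two projectors.
// u is the normalized position across the band on this projector's image
// (0 = start of overlap, 1 = this projector's outer edge). A smoothstep
// ramp is raised to 1/gamma so that, after the display applies gamma,
// the two projectors' light sums to a constant across the seam.
double BlendWeight(double u, double gamma = 2.2)
{
    u = std::clamp(u, 0.0, 1.0);
    const double ramp = u * u * (3.0 - 2.0 * u); // smoothstep falloff
    return std::pow(1.0 - ramp, 1.0 / gamma);
}
```

Because smoothstep satisfies ramp(1 - u) = 1 - ramp(u), the linear-light contributions of the two projectors at any point in the band add up to 1.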

Cool, when will the plugin be available?

Good question, TheFoyer!

One of the major advantages is the full perception of objects at 1:1 scale.

There are multiple pros and cons; maybe we will publish a comparison article one day!

Please send me an email at @vrcluster.io

The plugin is available for early adopters.

Here we go!

https://github.com/vrCluster/vrCluster

Please feel free to fork, play, and request help.

For commercial inquiries, please email @vrcluster.io

We are working hard to provide clear documentation.

Hi Vitaliiboiko

Maybe a quick tutorial?

I was able to point uvr1 at the right path for 1pc_8instances.cfg and the viewport changed, but I was not able to open more than one viewport. Also, are there any blending capabilities within the system?

Thank You!!!

best

That’s strange; I will double-check that the GitHub repo contains the right demo config files.

What exactly do you mean by “blending capabilities”?