
    [PLUGIN] Unreal Engine for CAVE™, Multi wall, Cluster systems

    Hi all,

    We are on final approach for our first release of vrCluster.

    http://vrcluster.io/

    Our plugin is a low-level integration into the UE rendering pipeline (our own implementation of the IStereoRendering interface), which gives superior performance in stereoscopic cluster and multi-wall environments.

    At a glance:

    - multi-PC (cluster), CAVE™, multi-wall, stereo support
    - scene object sync (cameras, animations, particles, etc.)
    - VRPN input (keys, axes, positioning)
    - OpenGL quad buffer support
    - vsync, G-SYNC and NV swap sync support
    - asymmetric frustum configuration for stereoscopic systems (perfect in 3D!)
    - Blueprint access (VRPN data, level load events, etc.)
    - Easily configurable
    - Cluster node listeners for remote launch
    - Detailed logging for troubleshooting
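    To illustrate the scene object sync idea above, here is a minimal, hypothetical sketch in plain C++ (no UE dependencies, and not vrCluster's actual protocol): the master node packs the camera pose into a flat byte buffer each frame, and cluster nodes unpack and apply it.

    ```cpp
    #include <cstdint>
    #include <cstring>
    #include <vector>

    // Hypothetical camera pose; the plugin's real sync format is not public.
    struct CameraPose {
        float position[3];   // world-space location
        float rotation[3];   // pitch, yaw, roll in degrees
    };

    // Master node: serialize the pose into a byte buffer for broadcast.
    std::vector<uint8_t> PackPose(const CameraPose& pose) {
        std::vector<uint8_t> buf(sizeof(CameraPose));
        std::memcpy(buf.data(), &pose, sizeof(CameraPose));
        return buf;
    }

    // Cluster node: restore the pose received from the master.
    CameraPose UnpackPose(const std::vector<uint8_t>& buf) {
        CameraPose pose{};
        std::memcpy(&pose, buf.data(), sizeof(CameraPose));
        return pose;
    }
    ```

    The same pack/unpack pattern would extend to animation time and particle seeds, so every node renders an identical frame.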

    We performed tests using demos available at the UE Learning Center:

    - ScifiHall Demo
    - Realistic Rendering Demo
    - Infiltrator Art Demo
    All scenes show superb performance in a clustered stereoscopic setup of 8 nodes (PC + projector), 5760 x 1920 plus a floor (1920 x 1920).

    Thanks!
    Last edited by hell0w0rld123; 08-22-2016, 08:07 AM.

    #2
    Wow! This is quite awesome!
    lucasgovatos.net | Twitter | Epitasis - A Colorful Exploration Puzzle Game

    Comment


      #3
      Originally posted by Higuy8000 View Post
      Wow! This is quite awesome!
      Thanks! We will release more videos soon.

      Comment


        #4
        Hi, I'm quite interested in the asymmetric frustum offsets. I had a v4.7 project which could do this, where you set the camera position as if your nose was against the screen, then you could set x, y, and z offsets. I'm no C++ expert so I couldn't get it working on later engine versions. Is this what your plugin can do? Is it dynamic, or just set on BeginPlay? What are your plans for this plugin?

        Comment


          #5
          Originally posted by Dannington View Post
          Hi, I'm quite interested in the asymmetric frustum offsets. I had a v4.7 project which could do this, where you set the camera position as if your nose was against the screen, then you could set x, y, and z offsets. I'm no C++ expert so I couldn't get it working on later engine versions. Is this what your plugin can do? Is it dynamic, or just set on BeginPlay? What are your plans for this plugin?
          In the video I disabled head tracking to make the picture clearer.
          Frustums are fully dynamic and updated in real time via GetStereoProjectionMatrix.

          We are considering releasing the source on GitHub with flexible licensing options.
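          For readers curious about the underlying math: an asymmetric (off-axis) projection can be built directly from the frustum edges at the near plane, glFrustum-style. This is a minimal plain-C++ sketch of the standard formula, not vrCluster's GetStereoProjectionMatrix implementation:

          ```cpp
          #include <cmath>

          // Row-major 4x4 matrix; m[row][col]. Unset entries are zero.
          struct Mat4 { float m[4][4] = {}; };

          // Standard off-axis (asymmetric) perspective frustum.
          // l, r, b, t: frustum edges on the near plane; n, f: near/far clip.
          // For a head-tracked CAVE wall, l/r/b/t are derived each frame from
          // the viewer's eye position relative to the physical screen.
          Mat4 OffAxisFrustum(float l, float r, float b, float t, float n, float f) {
              Mat4 p;
              p.m[0][0] = 2.0f * n / (r - l);
              p.m[0][2] = (r + l) / (r - l);   // horizontal asymmetry term
              p.m[1][1] = 2.0f * n / (t - b);
              p.m[1][2] = (t + b) / (t - b);   // vertical asymmetry term
              p.m[2][2] = -(f + n) / (f - n);
              p.m[2][3] = -2.0f * f * n / (f - n);
              p.m[3][2] = -1.0f;
              return p;
          }
          ```

          Recomputing the edges from the tracked eye position every frame is what makes the frustum "fully dynamic"; with tracking disabled, the edges stay fixed.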

          Comment


            #6
            Originally posted by vitaliiboiko View Post
            In the video I disabled head tracking to make the picture clearer.
            Frustums are fully dynamic and updated in real time via GetStereoProjectionMatrix.

            We are considering releasing the source on GitHub with flexible licensing options.
            This is interesting, though I feel my C++ knowledge is letting me down here. So GetStereoProjectionMatrix will return a ref to an FMatrix which you can then set dynamically, is that right? Previously this wasn't possible without an edit to the engine source; is it now something which could be added with a plugin? (Please bear with me, I'm grasping a bit and out of my depth.)

            I was waiting until NVIDIA's new multi-projection engine build came out. It seems like a nifty move to do this on the graphics card, but it's probably not going to be as easy as just defining an offset, and it's possibly not going to be dynamic.

            When you release your source I'd be really interested in taking a look. I've been working on a kind of virtual studio idea and had it working quite nicely using the tracking on an Oculus DK2, although that was obviously a very limited tracking area. I've now got a Vive with its Lighthouse system and want to use a handset transform to drive the frustum offset. I still have my 4.7 project with the adjustable matrix, but 4.7 doesn't have Vive support, so I can't try it out.

            This is a link to the AnswerHub thread for that 4.7 build if you're interested.

            Dan

            Comment


              #7
              Now that is really impressive! I haven't seen anyone do a CAVE system like this with UE4. We have one down at the NCSU tech library near Epic, I think, so I should probably see if they're interested in trying this out.
              Twitch /unrealalexander| Twitter @UnrealAlexander
              How to report a bug? | Installation & Setup issues?
              Call me to a thread by posting this: [MENTION]Alexander Paschall[/MENTION]

              Comment


                #8
                Originally posted by Alexander Paschall View Post
                Now that is really impressive! I haven't seen anyone do a CAVE system like this with UE4. We have one down at the NCSU tech library near Epic, I think, so I should probably see if they're interested in trying this out.
                Cool!
                Please let me know how it's going.

                Thanks!

                Comment


                  #9
                  That's how far I've gotten so far:


                  https://youtu.be/BppGSWJFJLk

                  Comment


                    #10
                    We made a simple website.

                    Please subscribe for updates.

                    http://vrcluster.io/

                    Comment


                      #11
                      Plugin updated to support Unreal Engine 4.14.3.

                      Please subscribe for updates:
                      http://vrcluster.io/

                      Comment


                        #12
                        How does a CAVE system like this compare to just a simple VR HMD? My first instinct is that the VR HMD would be more immersive.
                        Columbus Ohio Unreal Meetup
                        @OCGameStudio | Sphere Complex | twitch.tv/WFMOz

                        Comment


                          #13
                          This is amazing!

                          Comment


                            #14
                            Hi. I subscribed to your newsletter; I will be getting this for sure. Quick question: I am trying to blend two video projectors that are next to each other. I am assuming this can be done with a post-process material/blend. Can you please illuminate me on how to go about it? Also, when is your plugin coming out? Thank you!
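                            Not the plugin author, but a typical approach to soft-edge blending is a gamma-corrected ramp across the overlap region, which a post-process material could evaluate per pixel. A minimal sketch of the math in plain C++ (hypothetical helper, assuming a single display-gamma value for the projectors):

                            ```cpp
                            #include <algorithm>
                            #include <cmath>

                            // Blend weight for one projector across the overlap zone.
                            // x: normalized position across the overlap, 0 = fade start, 1 = fade end.
                            // gamma: display gamma; raising the linear ramp to 1/gamma keeps the
                            // *linear-light* contributions of the two projectors summing to a
                            // constant across the seam after the display applies its gamma.
                            float BlendWeight(float x, float gamma) {
                                float t = std::clamp(1.0f - x, 0.0f, 1.0f);  // fade out left-to-right
                                return std::pow(t, 1.0f / gamma);
                            }
                            ```

                            The second projector uses the mirrored ramp, BlendWeight(1 - x, gamma); in linear light the pair sums to 1 because t + (1 - t) = 1.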

                            Comment


                              #15
                              Cool, when will the plugin be available?

                              Comment
