
VR for projection environments?



    I see posts here about work integrating the Rift into Unreal. I was wondering if anyone was working on integration with (multiple) projection surfaces. I work in a lab with access to a CAVE system. I have a student over the summer working on getting Unity into a CAVE, and I'd like to get Unreal running in there as well. Unfortunately, I have zero Unreal development experience, so other than having already purchased the UE4 source, I'm overwhelmed about where to start.

    Things I'm interested in...

    - Active stereoscopic 3D (frame interlaced); either OpenGL or D3D (Win7 is strongly preferred)
    - Multiple non-planar display surfaces
    - User head and hand tracking. Integration with VRPN or TrackD preferred.
    - Asymmetric camera projection frustums. Combined with head tracking to create a 1:1 scale view.
    - Multi-host cluster support for large numbers of display outputs.

    The closest thing I've seen is CaveUDK (http://hci.uni-wuerzburg.de/projects/caveudk.html) which is awesome, but not available for public release.

    Thanks.

    Mike

    #2
    +1 to this!



      #3
      I'm interested in VRPN integration into UE4.

      I have no experience with or access to a CAVE system, but I will happily collaborate on a VRPN plugin, and I have previous UE4 InputDevice plugin experience.



        #4
        While we haven't done any projection environment work here, most of the hooks that you'd need should be available to do this kind of work.

        - Active stereoscopic rendering
        We only support side-by-side at the moment, but you could potentially extend our stereo rendering code to do it without too much effort. Take a look at FFakeStereoRenderingDevice in UnrealEngine.cpp to see a minimal implementation of a stereo device in the engine. If you expand upon that, you should be able to do interlacing by doing nothing in the AdjustViewRect function (to keep things full screen) and then calculating the StereoViewOffset every other frame. The interface might be slightly cleaner if you extended it to add another EStereoscopicPass for interlacing.
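        To make the alternating-eye idea above concrete, here is a minimal sketch in plain C++. This is not the real UE4 interface (the actual work would happen in an IStereoRendering subclass, as with FFakeStereoRenderingDevice); the struct, the method names, and the IPD value are all illustrative assumptions. The point is just the logic: leave the viewport full screen and flip the eye offset sign every frame.

```cpp
#include <cassert>

// Sketch only: alternating-eye (frame-sequential/interlaced) stereo logic.
// Not UE4 API; names and the IPD value are assumptions for illustration.
struct InterlacedStereoSketch {
    float HalfIpd = 0.032f;   // half interpupillary distance in metres (assumed)
    unsigned FrameIndex = 0;

    // "AdjustViewRect" equivalent: intentionally do nothing, so each eye
    // renders to the full screen rather than a side-by-side half.
    void AdjustViewRect(int& X, int& Y, int& W, int& H) const {
        (void)X; (void)Y; (void)W; (void)H;  // unchanged on purpose
    }

    // Lateral view offset for the current frame: left eye on even frames,
    // right eye on odd frames.
    float StereoViewOffset() const {
        return (FrameIndex % 2 == 0) ? -HalfIpd : +HalfIpd;
    }

    // Call once per rendered frame to switch eyes.
    void EndFrame() { ++FrameIndex; }
};
```

        Syncing this to the glasses emitter still needs driver/hardware support (see the quad-buffering discussion later in the thread); this sketch only shows the engine-side alternation.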

        - Multiple display surfaces
        Depending on how your video setup handles this, it could be easy or difficult. If the system works by slicing the same buffer, you could look at how we do AdjustViewRect to see how to render to different areas of one render target. If you require different outputs (multiple computers), you might have to get creative with using multiple clients connected to a server, or setting up multiple viewports.
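        The "slicing the same buffer" case can be sketched in a few lines. This is plain C++, not the engine's AdjustViewRect; the function and field names are assumptions. It simply computes the sub-rectangle of one wide render target that a given display should draw into.

```cpp
#include <cassert>

// Sketch: carve one wide render target into N equal side-by-side view
// rects, one per display surface. Illustrative only, not UE4 API.
struct ViewRect { int X; int Y; int Width; int Height; };

// Assumes TargetW divides evenly by NumSlices (e.g. 3 x 1920 = 5760).
ViewRect SliceViewRect(int TargetW, int TargetH, int NumSlices, int SliceIndex) {
    const int SliceW = TargetW / NumSlices;
    return ViewRect{ SliceIndex * SliceW, 0, SliceW, TargetH };
}
```

        For example, a 5760x1080 target split across three 1920x1080 projectors gives slices starting at x = 0, 1920, and 3840, each 1920 wide.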

        - User head and hand tracking
        Head tracking is already built into our HMD templates, so that should be pretty easy to figure out. Hand tracking isn't something we've done much of internally. Our input devices support arbitrary axis bindings, so I'd recommend hooking it up through that system, so that you can use all of our internal input handling directly.

        - Asymmetric camera projection
        We have done a bit of this, all going through the same IStereoRendering interface, modifying GetStereoProjectionMatrix() to return an asymmetric projection.
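        For anyone unfamiliar with what an asymmetric projection actually is, here is a self-contained sketch of an off-axis frustum matrix in the classic glFrustum convention, where left/right/bottom/top are measured on the near plane and need not be symmetric about the view axis. This is illustrative math only; UE4's GetStereoProjectionMatrix() uses its own matrix and depth conventions, so the real return value would differ.

```cpp
#include <array>

// Off-axis (asymmetric) perspective projection, glFrustum convention.
// Illustrative only; not the matrix layout UE4 itself uses.
using Mat4 = std::array<std::array<double, 4>, 4>;

Mat4 OffAxisFrustum(double l, double r, double b, double t, double n, double f) {
    Mat4 M{};  // zero-initialised
    M[0][0] = 2.0 * n / (r - l);
    M[1][1] = 2.0 * n / (t - b);
    M[0][2] = (r + l) / (r - l);      // nonzero when horizontally off-axis
    M[1][2] = (t + b) / (t - b);      // nonzero when vertically off-axis
    M[2][2] = -(f + n) / (f - n);
    M[2][3] = -2.0 * f * n / (f - n);
    M[3][2] = -1.0;
    return M;
}
```

        The CAVE connection: with head tracking, l/r/b/t are computed each frame from the tracked eye position relative to the fixed physical screen corners, which is exactly what produces the 1:1-scale view the first post asks about.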

        - Multi-host cluster
        Again, you probably want to do something where you set up one server, and then the other clients join in to that server, but everyone targets the same player pawn. You can have as many cameras as you want on a pawn, so you can make each client select the one appropriate for their view position. All the same stereo rendering functions will apply, no matter which camera you're using. Check out APlayerCameraManager::SetViewTarget for a good place to start!
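        The per-client selection step can be as simple as a fixed mapping from "which wall does this machine drive" to a view rotation applied on top of the shared pawn transform. A sketch (plain C++; the wall names and the idea of configuring each cluster node with one are assumptions, not anything UE4 provides):

```cpp
#include <string>

// Sketch: each cluster client follows the same pawn but applies a fixed
// yaw for the CAVE wall it renders. Illustrative only, not UE4 API.
double WallYawDegrees(const std::string& Wall) {
    if (Wall == "front") return 0.0;
    if (Wall == "left")  return -90.0;
    if (Wall == "right") return 90.0;
    if (Wall == "back")  return 180.0;
    return 0.0;  // floor/ceiling walls would need pitch instead (not shown)
}
```

        In practice each node would read its wall name from a per-machine config file at startup and pick the matching camera on the pawn.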

        Hope that helps!



          #5
          Guess I wasn't subscribed correctly. I thought this thread was dead. Glad to see there's some interest here.

          mspe044,

          A VRPN plugin would be awesome. Unfortunately, I don't have much VRPN experience (almost none), but I have a lot of experience using TrackD (including some device integration), which is conceptually similar.

          Nick,

          I need some sort of tutorial on the UE4 architecture and rendering pipeline, before commenting too much. You have some good pointers here. Thanks!

          I think the stereo setup might be more complicated than I'm reading into your comments. You need support from the graphics card for frame-interlaced stereo (e.g. quad-buffering) and a stereo sync output for a 3D glasses emitter. The video frame rate will generally be much higher than the game frame rate.

          Appreciate the help.


          Mike



            #6
            Originally posted by PSU Mike View Post
            Please let me know if you make any progress on this, Mike. For Unity I can strongly recommend MiddleVR from Im.in.VR; you can contact Sebastien directly. I am currently trying to persuade him to create a UE4-compatible version as well. What university are you with, Mike? Perhaps we could talk about VR at some point?
            http://www.imin-vr.com/middlevr-for-unity/
            Kind regards,
            Steve



              #7
              Originally posted by Nick Whiting View Post
              - Multiple display surfaces
              Depending on how your video setup handles this, it could be easy or difficult. If the system works by slicing the same buffer, you could look at how we do AdjustViewRect to see how to render to different areas of one render target. If you require different outputs (multiple computers), you might have to get creative with using multiple clients connected to a server, or setting up multiple viewports.
              Thanks for the info, Nick. I'm more of an artist and less a coder, so I'm going to try my best to get this working, as I really need it. A few questions that may seem simple, but please bear with me:

              - If I'm following correctly, it sounds like I need to write my own AdjustViewRect function, since the one you seem to be referring to was written specifically for stereo rendering devices. I'd like to split the screen into three rectangles and assign a camera to each of them. Will I need to write my own class, similar to FFakeStereoRenderingDevice?

              - Should I modify the source code, or is this something that should be done in a C++ game template? (I'm guessing the former...)

              Thanks in advance for your help and patience!



                #8
                Hi all,

                We have started a MiddleVR integration into UE4.

                The goal is to get improved VR support for professional applications: for example, being able to use any VR device, 3D TVs, stereoscopic walls, HMDs, and of course CAVEs.
                We are also bringing web page rendering inside the virtual world, immersive menus, etc.:
                https://www.youtube.com/watch?v=6EVe8I65cuk

                All those interested please let us know, it will help us move faster if we know your needs!

                Here is more info on MiddleVR: http://www.imin-vr.com/middlevr/

                Have a great day,
                Sebastien



                  #9
                  Hey all,

                  I'm a student attempting to get active stereoscopic rendering working for a university project. I tried Nick's recommendation of doing nothing in the AdjustViewRect function, but only a black screen appears.
                  The HUD still renders, but the game itself does not, and after about three hours of going through the engine code, I could use some help.

                  Can somebody hopefully point me in the right direction?

                  AJ Abotomey
                  Griffith University Student



                    #10
                    Definitely interested Sebastien, thank you.
                    regards
                    Stephen



                      #11
                      I'm interested in the asymmetric camera frustum. Only I didn't know what it was called until now. Does UE4 have this functionality built in? I'm trying to make a virtual set.

                      Dan



                        #12
                        +1 for this Sebastien!

                        We are also interested in creating a virtual reality room for some Unreal Engine scenarios. We want to show the potential of VR in an innovation center for education.



                          #13
                          We are currently in beta for an initial MiddleVR for UE4 version running in a CAVE!
                          Contact us if you're interested: contact (at) middlevr (dot) com

                          Here's a video to get you interested:



                          cb



                            #14
                            Hi, I am also working on an adaptation of UE4 for a CAVE.
                            Here is my thread: https://forums.unrealengine.com/show...232#post319232 (discussing the multi-GPU capabilities...)

                            And here is some video:



                            The results are pretty amazing thanks to the UE4 rendering.
                            I still have to work on:
                            - active stereo (the system is currently using a passive stereo setup)
                            - collision detection while navigating
                            - event management between the UE4 visual threads



                              #15
                              How did you manage to place all these views on different monitors? I am doing something similar right now and could use some code examples.

                              I need some advice about how to use quad buffering. I read a thread that said it was implemented in 4.8, but unfortunately I can't find any further information about that.

                              Here is the thread: https://forums.unrealengine.com/show...r-wall-display

                              Thanks in advance.

