Mixed Reality Status update

    #16
    Originally posted by wuzelwazel View Post
    I've been mostly successful in implementing mixed reality using this Spout plugin for UE4:

    https://github.com/AleDel/Spout-UE4

    And TouchDesigner:

    https://www.derivative.ca

    I'm sending a keyed texture from TouchDesigner's SpoutOut node into a material instance dynamic in UE4. I'm then reading back a texture from a scene capture 2d (a child of a motion controller with a user defined transform offset) into a TouchDesigner SpoutIn node. The results are visually quite good.

    The only issues I'm currently having are:

    1. The UE4 plugin seems to have some sort of memory leak with the Spout Receiver (no issues with the Spout Sender).

    2. I don't know how to delay the game to match the latency of the incoming keyed texture.

    No. 2 is much easier to handle if the composite is done outside of the engine. With the texture being composited inside the engine I need some way to add a custom amount of latency to everything in the game except the texture. I have no idea how to do that.

    Does anyone have a suggested approach to solving this latency issue?

    I'd also come very close to implementing a composite outside of the engine with 2 scene capture 2d components, one sending the tonemapped RGB and another sending a modified depth pass to separate foreground from background. Unfortunately that method doesn't work at all with translucent materials.
    Isn't it enough to just write the results of a scene capture2d to the screen and use OBS to handle the compositing?

    For that you just would need to edit the steamvr renderer code and add an additional option to draw a custom texture to the screen.
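A minimal sketch of the in-engine wiring described in the quoted post, assuming the Spout-UE4 plugin hands the received feed over as a UTexture2D (the plugin's actual API may differ); the parameter and function names below are illustrative, not the poster's exact setup:

```cpp
// Sketch only: wire a keyed camera feed into a dynamic material instance on the
// "player plane", and capture the engine's view into a render target that a
// Spout sender can push back out to TouchDesigner.
#include "GameFramework/Actor.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Components/StaticMeshComponent.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"

static void SetupMixedRealityComposite(AActor* Owner,
                                       UStaticMeshComponent* PlayerPlane,       // flat plane showing the keyed footage
                                       UMaterialInterface* KeyedPlaneMaterial,
                                       UTexture2D* SpoutReceivedTexture,        // assumed: texture exposed by the Spout receiver
                                       USceneCaptureComponent2D* CaptureComp)   // child of the tracked controller, with an offset
{
    // Drive the plane material with the keyed feed coming from TouchDesigner's SpoutOut.
    UMaterialInstanceDynamic* KeyedMID = UMaterialInstanceDynamic::Create(KeyedPlaneMaterial, Owner);
    KeyedMID->SetTextureParameterValue(TEXT("KeyedFeed"), SpoutReceivedTexture);
    PlayerPlane->SetMaterial(0, KeyedMID);

    // Capture the in-engine view into a render target for the Spout sender to read back.
    UTextureRenderTarget2D* RenderTarget = NewObject<UTextureRenderTarget2D>(Owner);
    RenderTarget->InitAutoFormat(1920, 1080);                                   // match the camera feed resolution
    CaptureComp->TextureTarget = RenderTarget;
    CaptureComp->CaptureSource = ESceneCaptureSource::SCS_FinalColorLDR;        // tonemapped RGB
    // For the out-of-engine variant mentioned in the quote, a second capture using
    // ESceneCaptureSource::SCS_SceneDepth could supply the depth pass.
}
```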



      #17
      Originally posted by Azarus View Post
      Isn't it enough to just write the results of a scene capture2d to the screen and use OBS to handle the compositing?

      For that you just would need to edit the steamvr renderer code and add an additional option to draw a custom texture to the screen.
      OBS is not a good compositor. I did have a fully developed system of compositing in post using TouchDesigner with occlusions and relighting but discovered to my dismay that translucent materials do not render into the world position or depth passes (which I was using to handle occlusion and relighting in TD).

      The benefits of 'compositing' in engine are quite nice. The player is lit by game lights and even reflected and refracted.



        #18
        I see, and it's a nice solution. I was just suggesting an "easier" solution that works too. As far as I saw in other mixed reality videos, they had their background set up properly with a green screen and such. For that I'm pretty sure OBS or other video streaming software would be a good fit. I was unable to test this properly due to the lack of a proper camera.

        Also, this feature is often used for trailers and by YouTubers, so they might be able to afford the assets required for such a video.

        Could you please post a picture or a short video of your solution? I'm interested in the results you've got :$ and wondering how different it is with reflections.



          #19
          I'd also love to see how well the in-game lighting works. Do you de-light the incoming camera feed somehow?



            #20
            FYI now labelled as Feb.: https://trello.com/c/plfg7Dio/871-vr...reality-movies
            Let's give it some more upvotes!
            Headgear - VR/AR solutions



              #21
              They'll announce this feature at GDC 2017.
              Steve Biegun
              Virtual Design Manager



                #22
                Originally posted by LokiDavison View Post
                I'd also love to see how well the in-game lighting works. Do you de-light the incoming camera feed somehow?
                I'm slammed at my day job right now, but hopefully I can put some images and videos together early in the new year.

                I am not de-lighting the incoming footage, although that is somewhat possible. Because the footage is currently applied to a flat plane, the in-game lighting should be considered supplemental to the real-world lighting in the source footage. I would say that a high-key lighting setup for the green-screened subject could provide the right balance between perceived depth (the in-game lighting occurs on a flat plane, so there's no way to self-shadow) and presence in the environment.

                I've built in controls to blend between the unlit source footage and the lit result in engine. It's also possible to color correct the footage before sending it into the engine.
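A hedged sketch of what such a blend control might look like on the engine side; the "LitBlend" parameter name and the assumption that the plane's material lerps between an unlit (emissive) path and a lit path are illustrative, not the exact setup described in the post:

```cpp
// Illustrative only: expose a 0-1 scalar that the plane material can use to lerp
// between the raw (unlit/emissive) camera footage and the fully lit result.
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

static void SetFootageLitBlend(UStaticMeshComponent* PlayerPlane, float LitBlend)
{
    if (UMaterialInstanceDynamic* MID = Cast<UMaterialInstanceDynamic>(PlayerPlane->GetMaterial(0)))
    {
        // Inside the material: Lerp(UnlitFootage, LitFootage, LitBlend).
        MID->SetScalarParameterValue(TEXT("LitBlend"), FMath::Clamp(LitBlend, 0.0f, 1.0f));
    }
}
```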



                  #23
                  I'm looking forward to the videos. Alas, I need it for CES in early Jan, so I'm probably going for the simpler option, as I've already got something composited in OBS working. What do you do for camera calibration? Do you read in files in the standard Vive calibration format?



                    #24
                    Originally posted by LokiDavison View Post
                    I'm looking forward to the videos. Alas, I need it for CES in early Jan, so I'm probably going for the simpler option, as I've already got something composited in OBS working. What do you do for camera calibration? Do you read in files in the standard Vive calibration format?
                    CES is the reason I'm so slammed right now!

                    For camera calibration I save the camera variables in a saved game file. I have in game models of the Vive controller being tracked as well as the camera it's attached to. I do calibration itself in three steps:

                    1. Have someone wear the Vive and try to reach out and touch the virtual camera model's lens with their controller. As they're doing this, someone at the computer interactively adjusts the camera position offsets using key inputs. Once the 'player' is touching the real camera lens, the position can be assumed to be as accurate as necessary and shouldn't need adjustment in later steps.

                    2. While viewing the composite, interactively adjust the rotation offsets of the camera until the physical controllers roughly align with the virtual controllers (it's useful at this stage to have Vive controller models that you can temporarily enable in-game to align against).

                    3. While viewing the composite, adjust the focal length of the virtual camera until the physical controllers are fully aligned at the correct scale. Go back to step 2 and make any further adjustments as necessary.

                    I find that by following the steps in this order it becomes a much easier process. It also helps me to think of the rotation adjustment step as sliding the virtual frame in 2D under the physical footage and the focal length step as scaling the virtual frame. The position of the camera is the only thing that really affects perspective and by having the player reach out and touch the physical camera we've set the perspective fairly accurately from step 1.

                    Of course, this is not really possible by editing a text file; it's only feasible when you can modify values interactively in-game.
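A rough sketch of the kind of saved-game structure this workflow implies; the class, field, and slot names are assumptions for illustration, not the poster's actual code:

```cpp
// Sketch: persist the interactively tuned camera calibration in a USaveGame so it
// survives restarts. Field names, defaults, and the slot name are made up here.
#include "GameFramework/SaveGame.h"
#include "Kismet/GameplayStatics.h"
#include "CameraCalibrationSave.generated.h"

UCLASS()
class UCameraCalibrationSave : public USaveGame
{
    GENERATED_BODY()
public:
    UPROPERTY() FVector  PositionOffset = FVector::ZeroVector;   // step 1: touch-the-lens alignment
    UPROPERTY() FRotator RotationOffset = FRotator::ZeroRotator; // step 2: slide the virtual frame
    UPROPERTY() float    FieldOfView    = 60.0f;                 // step 3: scale the virtual frame
};

// Call after the key-input adjustment pass so the calibration can be reloaded later.
static void SaveCameraCalibration(const FVector& Pos, const FRotator& Rot, float FOV)
{
    UCameraCalibrationSave* Save = Cast<UCameraCalibrationSave>(
        UGameplayStatics::CreateSaveGameObject(UCameraCalibrationSave::StaticClass()));
    Save->PositionOffset = Pos;
    Save->RotationOffset = Rot;
    Save->FieldOfView    = FOV;
    UGameplayStatics::SaveGameToSlot(Save, TEXT("MRCameraCalibration"), 0);
}
```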



                      #25
                      I was planning to use something like http://tribalinstincts.com/mixedrealityconfigurator/ to write out a "standard" file and then load that in, so I wouldn't need to build that adjustment behaviour into my application. Having a nice workflow for calibration would be good, but I'd love to allow users to reuse it between applications. Plus, I'd like to try saving ones for our different lenses, if the rest of the setup stays static enough.
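For what it's worth, the externalcamera.cfg-style files such community tools produce are, as far as I know, plain key=value text (x/y/z position, rx/ry/rz rotation, fov, clip planes). A minimal loader sketch under that assumption; the struct name, key set, and axis mapping are illustrative:

```cpp
// Sketch: read a key=value calibration file (externalcamera.cfg style) into a struct.
// The exact set of keys and the rotation axis mapping are assumptions.
#include "CoreMinimal.h"
#include "Misc/FileHelper.h"

struct FExternalCameraCalibration
{
    FVector  Position = FVector::ZeroVector;
    FRotator Rotation = FRotator::ZeroRotator;
    float    FOV      = 60.0f;
};

static bool LoadExternalCameraCfg(const FString& Path, FExternalCameraCalibration& Out)
{
    FString Contents;
    if (!FFileHelper::LoadFileToString(Contents, *Path))
    {
        return false;
    }

    TArray<FString> Lines;
    Contents.ParseIntoArrayLines(Lines);

    for (const FString& Line : Lines)
    {
        FString Key, Value;
        if (!Line.Split(TEXT("="), &Key, &Value)) { continue; }
        const float V = FCString::Atof(*Value);

        if      (Key == TEXT("x"))   { Out.Position.X     = V; }
        else if (Key == TEXT("y"))   { Out.Position.Y     = V; }
        else if (Key == TEXT("z"))   { Out.Position.Z     = V; }
        else if (Key == TEXT("rx"))  { Out.Rotation.Pitch = V; } // axis mapping is a guess
        else if (Key == TEXT("ry"))  { Out.Rotation.Yaw   = V; }
        else if (Key == TEXT("rz"))  { Out.Rotation.Roll  = V; }
        else if (Key == TEXT("fov")) { Out.FOV            = V; }
    }
    return true;
}
```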



                        #26
                        Upvoted! Absolutely critical for making high-quality promo material, and it has some other uses as well... https://trello.com/c/plfg7Dio/871-vr...reality-movies



                          #27
                          We are also waiting for it. I hope we will get it soon.
                          Augmented Reality Company | Virtual Mirror | Mobile Augmented Reality



                            #28
                            The good guys at Epic should please understand that a lot of YouTubers won't review your game if they can't make mixed reality videos. PLEASE FIX MIXED REALITY for UE4!



                              #29
                              +1 for this!



                                #30
                                No mention of it in the 4.15 P1 release notes
                                Headgear - VR/AR solutions
