Mixed Reality Status update


  • [VIVE] Mixed Reality Status update

    Does anyone have a status update about when mixed reality video recording will be possible with UE?

    Basically, I'd like to know if there is a branch where the 2D window can be set to a camera other than the VR camera. The last update I can find is this: https://twitter.com/EpicJamesG/statu...761597440?s=09

  • #2
    In a RoadtoVR article, it was implied that this would be in 4.13. Looks like we'll have to wait until 4.14 now. No news yet.
    Steve Biegun
    Virtual Design Manager

    • #3
      4.13 did add a way to designate a motion controller as controlling a camera transform. Nothing with rendering or viewports though. Networking is unfortunately still the easiest way.

      This pull request from Allar might get you part of the way towards what you want: https://github.com/EpicGames/UnrealEngine/pull/2444

      I'm not sure if it is possible to set up a real viewport, or if it would need to be based on a scene capture component.
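      If the scene capture route is the workable one, here is a minimal sketch of the idea: a USceneCaptureComponent2D attached to a UMotionControllerComponent, so the capture follows the tracked physical camera and renders into a render target. The actor and property names below are placeholders, not anything Epic ships.

      ```cpp
      // Hypothetical sketch, not Epic's implementation: a scene capture that
      // follows a tracked motion controller mounted on the real camera rig.
      #include "GameFramework/Actor.h"
      #include "MotionControllerComponent.h"
      #include "Components/SceneCaptureComponent2D.h"
      #include "SpectatorCaptureActor.generated.h"

      UCLASS()
      class ASpectatorCaptureActor : public AActor
      {
          GENERATED_BODY()

      public:
          ASpectatorCaptureActor()
          {
              // Tracked controller physically mounted to the real camera.
              Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
              RootComponent = Tracker;

              // Capture that follows the tracker; assign a render target in
              // the editor and draw or read it back however you capture it.
              Capture = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("Capture"));
              Capture->SetupAttachment(Tracker);
              Capture->bCaptureEveryFrame = true;
          }

          UPROPERTY(VisibleAnywhere)
          UMotionControllerComponent* Tracker;

          UPROPERTY(VisibleAnywhere)
          USceneCaptureComponent2D* Capture;
      };
      ```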

      • #4
        I eventually want something similar to what is available with Unity, in that it spits out multiple layers so you can have objects both in front of and behind the camera footage. For the moment, anything would be an improvement. Networking is hugely fiddly for an end user to set up, and that's the main use case for us.

        • #5
          I'm testing out this pull request at the moment. In the rejection, Epic staff mention that they are implementing this themselves, but I'm not sure when that will be available.

          https://github.com/EpicGames/UnrealEngine/pull/2727. It renders a scene capture to the mirror window.

          • #6
            I'm just using this for making non-mixed-reality b-roll footage at the moment, but it's working pretty well. Performance is okay, at least when using the Oculus forward rendering branch.

            • #7
              I've had a simple solution for mixed reality going for a while, based on the hack I submitted above and some OBS/XSplit-related tomfoolery to sync the video feeds. My solution allows for real-time compositing on one machine, but because of limitations in the post process system I can't replicate what the Unity guys are doing without major engine changes. The render-to-texture, post process, and window control areas of the engine seem disappointingly underdeveloped. Here is a list of the things holding me back from equivalent results:

              1. No alpha support for post process blendables. I know the plan is to replace the whole post process system at some point, but having this would make render-to-texture support much more acceptable.
              2. No way to capture multiple render targets with different post process settings from a single USceneCaptureComponent. When the post process runs there is a huge amount of data about the frame available, but you can only output one texture as a result. To efficiently produce the output the Unity guys are getting, we need at least two. There seems to be a solution for this in FCompositionGraphCaptureProtocol, but the USceneCaptureComponent code isn't using it (a rough double-capture workaround is sketched below).

              My plan at this stage is to build a new USceneCaptureComponent based on the code in FCompositionGraphCaptureProtocol, as a plugin. I'd love to know more about what Epic has planned, though; I hate reinventing the wheel.
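              For anyone who only needs the two outputs and doesn't care about single-pass efficiency, a minimal workaround sketch (a hypothetical helper, assuming both render targets already exist) is to run the same USceneCaptureComponent2D twice per frame with different post process settings:

              ```cpp
              // Hypothetical workaround sketch, not the FCompositionGraphCaptureProtocol
              // approach: two manual captures from one component, each with its
              // own post process settings and render target.
              #include "Components/SceneCaptureComponent2D.h"
              #include "Engine/TextureRenderTarget2D.h"

              void CaptureForegroundAndBackground(USceneCaptureComponent2D* Capture,
                                                  UTextureRenderTarget2D* ForegroundRT,
                                                  UTextureRenderTarget2D* BackgroundRT,
                                                  const FPostProcessSettings& ForegroundPP,
                                                  const FPostProcessSettings& BackgroundPP)
              {
                  // Drive the captures manually instead of once per frame.
                  Capture->bCaptureEveryFrame = false;

                  // Pass 1: foreground settings into the foreground target.
                  Capture->PostProcessSettings = ForegroundPP;
                  Capture->TextureTarget = ForegroundRT;
                  Capture->CaptureScene();

                  // Pass 2: background settings into the background target.
                  Capture->PostProcessSettings = BackgroundPP;
                  Capture->TextureTarget = BackgroundRT;
                  Capture->CaptureScene();
              }
              ```

              The obvious cost is rendering the scene twice, which is exactly what a FCompositionGraphCaptureProtocol-style single-pass approach would avoid.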

              • #8
                Hello, any updates on mixed reality support?
                I saw this, and I'm wondering if they are using a built-in system.

                • #9
                  I really need mixed reality for a demonstration, and I was a little disappointed that it's not in the 4.14 preview. But when I searched GitHub I found this: https://github.com/degica/UnrealEngine4-MixedReality Does anybody know if it works? (My HTC Vive is unfortunately at my workplace at the moment...)

                  • #10
                    This code seems to take the same approach I used for drawing alternate render targets to the game window so you can capture them with OBS. It renders two render targets, one to the top and one to the bottom of the window; I assume these are used for the foreground and background scene captures. The big drawback is the lack of alpha channel support in the render targets, so the foreground can't be blended perfectly onto the camera feed. It will work, though.
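                    For reference, stacking two render targets in the window like that can be done from a HUD class. This is only a rough sketch of the idea; AMixedRealityHUD and the two render target properties are placeholder names, not part of that repository:

                    ```cpp
                    // Hypothetical sketch: draw two render targets stacked in the game
                    // window (foreground on top, background on the bottom) so an
                    // external tool such as OBS can crop each half.
                    #include "GameFramework/HUD.h"
                    #include "Engine/TextureRenderTarget2D.h"
                    #include "MixedRealityHUD.generated.h"

                    UCLASS()
                    class AMixedRealityHUD : public AHUD
                    {
                        GENERATED_BODY()

                    public:
                        UPROPERTY(EditAnywhere, Category = "Mixed Reality")
                        UTextureRenderTarget2D* ForegroundRT;

                        UPROPERTY(EditAnywhere, Category = "Mixed Reality")
                        UTextureRenderTarget2D* BackgroundRT;

                        virtual void DrawHUD() override
                        {
                            Super::DrawHUD();
                            if (!Canvas || !ForegroundRT || !BackgroundRT)
                            {
                                return;
                            }

                            const float HalfHeight = Canvas->SizeY * 0.5f;

                            // Top half: foreground capture.
                            DrawTexture(ForegroundRT, 0.f, 0.f, Canvas->SizeX, HalfHeight,
                                        0.f, 0.f, 1.f, 1.f);

                            // Bottom half: background capture.
                            DrawTexture(BackgroundRT, 0.f, HalfHeight, Canvas->SizeX, HalfHeight,
                                        0.f, 0.f, 1.f, 1.f);
                        }
                    };
                    ```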

                    • #11
                      Does anyone have a status update on this? It's not in 4.14.

                      • #12
                        Originally posted by LokiDavison:
                        Does anyone have a status update on this? It's not in 4.14.
                        I'm patiently waiting.
                        Spline-Enabled Fence Pack - "Great pack, huge time-saver, top quality and enough variety to really make the assets unique." -Dark Acre Jack
                        Destructible Road Signs - "Seriously awesome. Single blueprint for pretty much all the road signs you'll ever need." -thankstipscom
                        Zipline/Teleporter/JumpPad - "This is a very flexible system for any project, and really simple to implement." -lunyBunny
                        Wooden Storage Pack - "Very good high quality assets and well worth it." -Deathweave
                        Digital Portfolio

                        • #13
                          The Trello roadmap indicates that they will likely be working on this through December. Honestly, given the complete radio silence from the UE4 dev team, I would assume that they plan on making a really big marketing announcement about this around the holidays to help push VR.
                          Steve Biegun
                          Virtual Design Manager

                          • #14
                            Originally posted by SBiegun_PDG:
                            The Trello roadmap indicates that they will likely be working on this through December. Honestly, given the complete radio silence from the UE4 dev team, I would assume that they plan on making a really big marketing announcement about this around the holidays to help push VR.
                            I really hope so, but I have a bad feeling about that...

                            Anyway, let's hope it can come close to competing with this:
                            http://owlchemylabs.com/owlchemyvr-m...lity-update-2/
                            That is absolutely brilliant!
                            Headgear - VR solutions

                            VR Game Release: We Come In Peace...
                            Oculus GearVR: https://www2.oculus.com/experiences/...6630401745678/
                            Daydream VR: https://play.google.com/store/apps/details?id=com.headgear.WCIP

                            • #15
                              I've been mostly successful in implementing mixed reality using this Spout plugin for UE4:

                              https://github.com/AleDel/Spout-UE4

                              And TouchDesigner:

                              https://www.derivative.ca

                              I'm sending a keyed texture from TouchDesigner's SpoutOut node into a dynamic material instance in UE4. I'm then reading back a texture from a Scene Capture 2D component (a child of a motion controller with a user-defined transform offset) into a TouchDesigner SpoutIn node. The results are visually quite good.
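                              For the in-engine half of that setup, here is a minimal sketch of pushing the received feed into a material, assuming the Spout plugin hands you a UTexture* each frame; the mesh, helper function, and parameter name are placeholders:

                              ```cpp
                              // Hypothetical sketch: feed the received camera texture into a
                              // dynamic material instance on a backdrop mesh whenever a new
                              // frame arrives.
                              #include "Components/StaticMeshComponent.h"
                              #include "Materials/MaterialInstanceDynamic.h"
                              #include "Engine/Texture.h"

                              void ApplyCameraFeed(UStaticMeshComponent* BackdropMesh, UTexture* SpoutTexture)
                              {
                                  if (!BackdropMesh || !SpoutTexture)
                                  {
                                      return;
                                  }

                                  // Create (or fetch) a dynamic instance of the mesh's first
                                  // material so its texture parameter can track the live feed.
                                  UMaterialInstanceDynamic* MID =
                                      BackdropMesh->CreateAndSetMaterialInstanceDynamic(0);
                                  if (MID)
                                  {
                                      MID->SetTextureParameterValue(TEXT("CameraFeed"), SpoutTexture);
                                  }
                              }
                              ```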

                              The only issues I'm currently having are:

                              1. The UE4 plugin seems to have some sort of memory leak with the Spout Receiver (no issues with the Spout Sender).

                              2. I don't know how to delay the game to match the latency of the incoming keyed texture.

                              No. 2 is much easier to handle if the composite is done outside of the engine. With the texture being composited inside the engine I need some way to add a custom amount of latency to everything in the game except the texture. I have no idea how to do that.

                              Does anyone have a suggested approach to solving this latency issue?
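                              One partial workaround, sketched here under the assumption that delaying only the virtual camera (rather than the whole game) is acceptable, is to buffer the tracked transform and apply it to the Scene Capture 2D a few frames late, so the virtual camera lags by roughly the same amount as the incoming video. All names below are placeholders:

                              ```cpp
                              // Hypothetical sketch: delay the virtual camera rather than the
                              // game by buffering tracker transforms and applying the oldest
                              // sample each frame.
                              #include "CoreMinimal.h"

                              class FDelayedTransformBuffer
                              {
                              public:
                                  explicit FDelayedTransformBuffer(int32 InDelayFrames)
                                      : DelayFrames(FMath::Max(InDelayFrames, 0))
                                  {
                                  }

                                  // Call once per frame with the live tracker transform; returns
                                  // the sample from DelayFrames ago (or the oldest available
                                  // sample until the buffer fills).
                                  FTransform Update(const FTransform& LiveTransform)
                                  {
                                      Samples.Add(LiveTransform);
                                      if (Samples.Num() > DelayFrames + 1)
                                      {
                                          Samples.RemoveAt(0);
                                      }
                                      return Samples[0];
                                  }

                              private:
                                  int32 DelayFrames;
                                  TArray<FTransform> Samples;
                              };

                              // Per-tick usage on the capture rig (Tracker and Capture are placeholders):
                              //   Capture->SetWorldTransform(Buffer.Update(Tracker->GetComponentTransform()));
                              ```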

                              I also came very close to implementing a composite outside of the engine with two Scene Capture 2D components, one sending the tonemapped RGB and another sending a modified depth pass to separate the foreground from the background. Unfortunately, that method doesn't work at all with translucent materials.
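                              For anyone wanting to try that external-composite route, a minimal configuration sketch (placeholder names, and assuming the compositor outside the engine does the depth thresholding) is simply two captures with different capture sources:

                              ```cpp
                              // Hypothetical sketch: one capture sends tonemapped color, the
                              // other sends scene depth, for compositing outside the engine.
                              #include "Components/SceneCaptureComponent2D.h"
                              #include "Engine/TextureRenderTarget2D.h"

                              void ConfigureExternalCompositeCaptures(USceneCaptureComponent2D* ColorCapture,
                                                                      USceneCaptureComponent2D* DepthCapture,
                                                                      UTextureRenderTarget2D* ColorRT,
                                                                      UTextureRenderTarget2D* DepthRT)
                              {
                                  // Tonemapped, post-processed color for the external compositor.
                                  ColorCapture->CaptureSource = SCS_FinalColorLDR;
                                  ColorCapture->TextureTarget = ColorRT;

                                  // Scene depth; the compositor thresholds it against the measured
                                  // camera-to-player distance to split foreground from background.
                                  DepthCapture->CaptureSource = SCS_SceneDepth;
                                  DepthCapture->TextureTarget = DepthRT;
                              }
                              ```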
