
Mixed Reality Status update


  • I attached the spectator camera to a Vive controller. Then I held my smartphone just above the Vive controller and mixed the footage in Adobe After Effects. I'm making a better video today!

    • I am not sure about the difference between using the "Spectator Screen" and the "Mixed Reality Framework" plugin. Can anyone explain the difference?

      • We have created a mixed reality video for the trailer of "VR Shooter Guns". We used a custom 4.14 engine build and a Vive Tracker. However, we put the footage together afterwards (not in real time):

        https://www.youtube.com/embed/OvHBCzdlIT8

        VR Shooter Guns - Arcade Shooter for Vive
        Magic Tavern - A Magic Shooting Gallery for Virtual Reality
        Unreal Meetup Franken - Unreal Engine 4 Meetup
        Hands for VR: Basic - For Vive and Oculus [Marketplace]
        Hands for VR: SciFi - For HTC Vive and Oculus Touch [Marketplace]

        • I posted this on a thread about spectator mode:

          We had our own code to output a spectator cam to the main window a year ago, and we used OBS to capture that window and combine it in real time with a green-screen camera feed. If you want a real-time result, you need to run the game output through a capture card so that OBS can delay the footage enough to sync it with the camera footage. A one-system hack is to use XSplit to rebroadcast the game capture as a DirectShow device, which OBS can then delay just like it does with capture cards.

          For best results you also need to render a foreground-only view to composite on top of your green-screen footage. The ideal setup is two machines and two low-latency HD or 4K capture cards: one machine runs the VR and spectator views, and the other captures and composites the result. That way the capture machine has enough guts to composite the video in real time on set and store the raw streams for high-quality compositing in post.
          I would stick to OBS-based mixed reality because you can sync the video and game footage, although having streaming video code in the engine is definitely a good thing.

          All this reminds me that I have some improvements to the SceneCaptureComponent that allow you to capture multiple textures off a single render. They let us get the foreground and background textures without the extra overhead of two scene captures. I need to update them to the latest engine and submit them.
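
          For reference, here is a rough C++ sketch of one way to get that foreground-only view (an approximation, not necessarily Joti's actual setup): cull everything farther from the spectator camera than the player's HMD each tick. The actor and component names below are placeholders, and MaxViewDistanceOverride is a per-object distance cull rather than a true far clip plane.

          [CODE]
// ForegroundOnlyCapture.h -- hypothetical actor, illustrative only.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Kismet/GameplayStatics.h"
#include "ForegroundOnlyCapture.generated.h"

UCLASS()
class AForegroundOnlyCapture : public AActor
{
    GENERATED_BODY()

public:
    AForegroundOnlyCapture()
    {
        PrimaryActorTick.bCanEverTick = true;
        ForegroundCapture = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("ForegroundCapture"));
        RootComponent = ForegroundCapture;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // In VR the player camera follows the HMD, so its location is a good proxy
        // for where the player is in world space.
        if (APlayerCameraManager* PlayerCamera = UGameplayStatics::GetPlayerCameraManager(this, 0))
        {
            const float DistanceToPlayer =
                FVector::Dist(GetActorLocation(), PlayerCamera->GetCameraLocation());

            // Cull everything beyond the player so only the foreground is captured.
            ForegroundCapture->MaxViewDistanceOverride = DistanceToPlayer;
        }
    }

    // This actor would be posed to match the physical camera, e.g. parented to a
    // motion controller driven by the tracked third controller.
    UPROPERTY(VisibleAnywhere)
    USceneCaptureComponent2D* ForegroundCapture;
};
          [/CODE]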

          • Originally posted by Joti View Post
            I posted this on a thread about spectator mode:



            I would stick to OBS-based mixed reality because you can sync the video and game footage, although having streaming video code in the engine is definitely a good thing.

            All this reminds me that I have some improvements to the SceneCaptureComponent that allow you to capture multiple textures off a single render. They let us get the foreground and background textures without the extra overhead of two scene captures. I need to update them to the latest engine and submit them.
            Have you managed to submit your improvements to SceneCaptureComponent?

            • Originally posted by emretanirgan View Post
              In case anyone is wondering, with the latest 'Spectator Screen' changes it should now be possible to do proper mixed reality capture without the need for a custom plugin. I'm still playing with it and I want to write up something once I test it out, but here's how I'm doing it so far:

              - Have 2 SceneCapture2D Actors in the scene. Each one renders to a different Render Target. One will be used for capturing just foreground (what's between the headset and the SceneCapture2D), and the other one will capture the whole scene. You can use something like the green screen shader that was mentioned in this thread for the foreground capture object. And you can check out this thread to figure out how to set the transform of the SceneCapture2D Actors to be the same as the 3rd Vive controller transform. (Since Oculus also announced support for additional tracked controllers, it should also be possible to use the transform coming from a 3rd Touch).
              - Set your spectator mode to 'Texture Plus Eye', and have the eye take up one quadrant, while the texture takes up the whole view. Have the eye draw over the texture.
              - Create a PostProcess Material that combines two Render Targets into one image. I can post my material nodes once I'm done testing the whole thing, but basically set the UTiling and VTiling of your TexCoord[0] to 2, then sample the two Texture Samples and combine them so each one takes one half of the resulting image.
              - Draw this material to another render target on Tick and set that to be the Spectator Mode Texture.
              - Either capture the individual quadrants in OBS and composite them so that the real camera footage is sandwiched between your foreground and background, or capture the whole window and then do your editing and compositing in other software such as After Effects or Premiere. I'd prefer the latter if you don't have to stream your mixed reality footage, since you get a lot more control over the final result.

              That should be it! I don't know if this is the most optimized way to do it, but I just did a simple test on my Oculus where I had two scene captures in my scene capturing every frame, and it was comfortably running at 90 fps. Whereas with the plugin that was mentioned in this thread over the past couple of months, I had lower-res render targets with scene captures that weren't even running every frame, and most of the time I was at 45 fps on Vive.

              I'll try to test it out and write up something within the next few weeks if that'd be helpful. Here's an example screenshot below that I got using the steps I mentioned.

              [ATTACH=CONFIG]152267[/ATTACH]
              Can you please post the material that combines the two render targets?
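
              In the meantime, here is a rough C++ sketch of the spectator-screen half of the setup quoted above, under the assumption that the UE 4.17+ UHeadMountedDisplayFunctionLibrary and UKismetRenderingLibrary APIs are used. The actor and property names are placeholders rather than emretanirgan's actual code, and the combining material itself still has to be authored as described in the quoted post.

              [CODE]
// MixedRealityCompositor.h -- hypothetical helper actor, illustrative only.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Materials/MaterialInterface.h"
#include "Kismet/KismetRenderingLibrary.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "MixedRealityCompositor.generated.h"

UCLASS()
class AMixedRealityCompositor : public AActor
{
    GENERATED_BODY()

public:
    AMixedRealityCompositor() { PrimaryActorTick.bCanEverTick = true; }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // The texture fills the whole spectator view; one eye is drawn in the
        // top-left quadrant on top of it.
        UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::TexturePlusEye);
        UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenModeTexturePlusEyeLayout(
            FVector2D(0.f, 0.f), FVector2D(0.5f, 0.5f),  // eye rect (normalized)
            FVector2D(0.f, 0.f), FVector2D(1.f, 1.f),    // texture rect (whole view)
            /*bDrawEyeFirst=*/ false);                   // draw the eye over the texture
        UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(CombinedTarget);
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Re-composite the foreground/full-scene captures into the combined target
        // every frame via the material described in the quoted post.
        UKismetRenderingLibrary::DrawMaterialToRenderTarget(this, CombinedTarget, CombineMaterial);
    }

    // Render target shown on the spectator screen.
    UPROPERTY(EditAnywhere)
    UTextureRenderTarget2D* CombinedTarget;

    // Material that samples the two SceneCapture render targets side by side.
    UPROPERTY(EditAnywhere)
    UMaterialInterface* CombineMaterial;
};
              [/CODE]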

              • @emretanirgan Can you please share the Blueprint you used to write the material to a render target? For me, writing the material to a render target does not work for some reason.
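
                In case it helps narrow things down, here is a minimal sketch of the call sequence that should draw a material into a render target. The names (AMyTestActor, CombineMaterial) are placeholders, and the render target could just as well be an asset assigned in the editor instead of being created at runtime.

                [CODE]
// Illustrative snippet only -- AMyTestActor and CombineMaterial are placeholders.
#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/KismetRenderingLibrary.h"

void AMyTestActor::BeginPlay()
{
    Super::BeginPlay();

    // Create a render target (or assign one in the editor), then draw the
    // combining material into it.
    UTextureRenderTarget2D* Target =
        UKismetRenderingLibrary::CreateRenderTarget2D(this, 1920, 1080);
    UKismetRenderingLibrary::DrawMaterialToRenderTarget(this, Target, CombineMaterial);
}
                [/CODE]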

                • I used the mixed reality sample for UE4 at:
                  https://github.com/richmondx/UE4-MixedReality
                  but when compiling BP_VRCapture2D it could not find a function named "SetCaptureRenderTarget" in SteamVRFunctionLibrary.
                  How can I make sure "SteamVRFunctionLibrary" has been compiled with that function?

                  • I don't know if anyone else has made any progress using the Mixed Reality Framework, but I have not only been able to use the Calibrator map to sync the controllers to your hands in the video, but also to use any Windows-compliant video source, e.g. webcams and HDMI-USB3 encoders (for camcorders and DSLRs), to composite the video and the VR camera directly to the screen, which can then be recorded via an HDMI recorder or a software-based video capture tool, or even used as a live feed for streaming or a monitor. Sadly, there is one issue I have run into: an extremely dark image, as though the levels have been crushed dramatically. The XR Framework does use a Spectator Screen object, but the view shown in the headset is well lit. It is only the final output, when you hit Run or compile, that appears dark.

                    If anyone is interested in learning more about what I have discovered, I can make a tutorial video and some rough documentation. If anyone out there is already familiar with what I am talking about, any suggestions on how to improve the video output would be gratefully accepted.

                    Here are a few images and a link to a sample video on YouTube: https://youtu.be/h3c4eChdhzY
