How did this guy make this video?
Well… is this game Unity-based or Unreal-based?
Cheers.
I’m not sure about Unity. I have this game for my Vive, and I assume they made it in Unreal Engine.
Arizona Sunshine is made in Unity.
Why haven’t Unreal developers been able to make this for so long? I want my demos to have a third-person view too. I have a Vive and a Vive Tracker, and I don’t mind using a green screen and overlaying two videos, but how do I capture video from a virtual third-person camera that follows the Vive Tracker someone is holding while filming me at the same time?
Just noticed Oculus added mixed reality UE4 support for the Rift! https://developer.oculus.com/documentation/unreal/latest/concepts/unreal-mrc/
Can this be ported to be used with the Vive somehow?
Unreal 4.17 now supports “Spectator Screen”.
What do we need now for a Mixed Reality plug-in?
My tests with Spectator Screen in 4.17.0
I don’t know how to keep the texture from stretching or how to fit it to the proper dimensions. I used a SceneCapture2D to generate a texture and added that texture to the Spectator Screen, but how do I capture it to a video file now? The manual has an option to set the capture FPS:
https://docs.unrealengine.com/latest/INT/Platforms/VR/VRSpectatorScreen/index.html
Maybe someone knows how to do the capture, on a green screen?
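For reference, here is roughly what I mean as a C++ sketch (not tested; `SceneCapture` and `SpectatorRT` are placeholder names for my SceneCapture2D component and its render target):

```cpp
// Sketch only: a SceneCapture2D renders into a render target,
// and that render target is shown on the Spectator Screen.
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "HeadMountedDisplayFunctionLibrary.h"

void AThirdPersonCaptureRig::BeginPlay()
{
    Super::BeginPlay();

    // SceneCapture is the USceneCaptureComponent2D on this actor,
    // SpectatorRT is the UTextureRenderTarget2D it renders into.
    SceneCapture->TextureTarget = SpectatorRT;
    SceneCapture->bCaptureEveryFrame = true;

    // Show only the texture on the spectator (mirror) window.
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::Texture);
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(SpectatorRT);
}
```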
In case anyone is wondering, with the latest ‘Spectator Screen’ changes it should now be possible to do proper mixed reality capture without the need for a custom plugin. I’m still playing with it and I want to write up something once I test it out, but here’s how I’m doing it so far:
- Have 2 SceneCapture2D Actors in the scene. Each one renders to a different Render Target. One will be used for capturing just foreground (what’s between the headset and the SceneCapture2D), and the other one will capture the whole scene. You can use something like the green screen shader that was mentioned in this thread for the foreground capture object. And you can check out this thread to figure out how to set the transform of the SceneCapture2D Actors to be the same as the 3rd Vive controller transform. (Since Oculus also announced support for additional tracked controllers, it should also be possible to use the transform coming from a 3rd Touch).
- Set your spectator mode to ‘Texture Plus Eye’, and have the eye take up one quadrant, while the texture takes up the whole view. Have the eye draw over the texture.
- Create a PostProcess Material that combines two Render Targets into one image. I can post my material nodes once I’m done testing the whole thing, but basically set the UTiling and VTiling of your TexCoord[0] to 2, then sample the two Texture Samples and combine them so each one takes one half of the resulting image.
- Draw this material to another render target on Tick and set that to be the Spectator Mode Texture (there’s a rough C++ sketch of these spectator screen steps after this list).
- Either capture the individual quadrants in OBS and composite them so that the real camera is sandwiched between your foreground and background, or capture the whole window and then do your editing and compositing in other software like After Effects or Premiere. I’d prefer the latter if you don’t have to stream your mixed reality footage, since you get a lot more control over the final result.
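Here is a rough C++ sketch of the spectator screen part of those steps (not tested; `ForegroundRT`, `FullSceneRT`, `CombinedRT`, `CombineMaterial`, and the texture parameter names are placeholders for your own assets, and the equivalent Blueprint nodes do the same thing):

```cpp
// Sketch of the TexturePlusEye setup and the per-Tick composite draw.
#include "HeadMountedDisplayFunctionLibrary.h"
#include "Kismet/KismetRenderingLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/TextureRenderTarget2D.h"

void AMixedRealityRig::BeginPlay()
{
    Super::BeginPlay();

    // Material that tiles the foreground and full-scene captures side by side.
    CombineMID = UMaterialInstanceDynamic::Create(CombineMaterial, this);
    CombineMID->SetTextureParameterValue(TEXT("Foreground"), ForegroundRT);
    CombineMID->SetTextureParameterValue(TEXT("FullScene"), FullSceneRT);

    // Texture fills the spectator window; the eye view takes one quadrant and is
    // drawn on top (check the bDrawEyeFirst semantics on your engine version).
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::TexturePlusEye);
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenModeTexturePlusEyeLayout(
        FVector2D(0.f, 0.f), FVector2D(0.5f, 0.5f),  // eye quadrant
        FVector2D(0.f, 0.f), FVector2D(1.f, 1.f),    // texture covers the full view
        /*bDrawEyeFirst=*/ false, /*bClearBlack=*/ false);

    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(CombinedRT);
}

void AMixedRealityRig::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Combine the two captures into the texture the spectator screen displays.
    UKismetRenderingLibrary::DrawMaterialToRenderTarget(this, CombinedRT, CombineMID);
}
```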
That should be it! I don’t know if this is the most optimized way to do it, but I just did a simple test on my Oculus where I had two scene captures in my scene capturing every frame, and it was comfortably running at 90 FPS. Whereas with the plugin that was mentioned in this thread over the past couple of months, I had lower-res render targets with scene captures that weren’t even running every frame, and most of the time I was at 45 FPS on Vive.
I’ll try to test it out and write up something within the next few weeks if that’d be helpful. Here’s an example screenshot below that I got using the steps I mentioned.
Even better than a write-up would be an example project with all the cameras and render targets you talk about already set up, if that’s easy for you. It’s always easier to work it out from an example. Many thanks for taking the time to share your findings with the community!
Anyone tried the mixed reality plugin that’s in beta?
Can you please link me?
Between the composure plugin, spectator screen, and maybe this mixed reality framework plugin, I’m hoping for the mixed reality workflow to pop up here within the month.
I decided to fool around with the Mixed Reality Framework plugin yesterday and got some good results. Mind you, it did crash a few times, but it is a beta plugin, so that was to be expected. After enabling the plugin (using 4.17.1) and the ability to view plugin content in your editor (if you don’t know how: click the View Options dropdown at the bottom right of the Content Browser and check the box), browse to the MixedRealityFramework Content in the Sources panel (in the Content Browser, click the icon underneath the Add New dropdown and to the left of the Filters dropdown) and you will see the following contents; the circled items are what you should see…
Open up the M_MRVidProcessing material and break the link between the Texture Object and the Texture Sample, so it won’t override the video we send it.
Now make a Media File Source, grab some green screen test footage from YouTube or wherever, and download it, because for now that seems to be the only way to do this: streaming via URL does not work, and I have no idea where I could live stream to and then stream that back in while keeping it compatible. I tried YouTube Live and it didn’t work. I also tried a few other sites for just streaming into the project and none worked, so someone with more knowledge is needed there…
So after you have your downloaded video… open up your Media File Source and set the path to the video.
Now open the MRVideoSource Media Player; your video source should be visible under the large black window. Make sure Play on Open is checked (it should be by default), and for testing I’d check Loop as well.
Now we just need to get a MixedRealityCapture component into the scene, so you can either make a **BP_Actor** and add the component, or create one in the **level BP**. In the end we are going to send this to a spectator screen, so at this point, if you make the **BP_Actor**, drag it into the scene and in its details click on the **MixedRealityCapture** component. Scroll down to Video Capture and ensure Media Source = MRVideoSource and Video Processing Material = M_MRVidProcessing, then scroll down further to Scene Capture and, for Texture Target, make a RenderTarget2D and set the Composite Mode to Composite.
If you create one in the **level BP**, most things will be set by default **except for the texture target**, which you’ll have to set to the newly created RenderTarget2D we just made.
Now open the level BP….
Make a variable, call it **MR_Player** or whatever, and set its type to Media Player. Then wire Event BeginPlay > Open Source (Media Player target), with the target set to our MR_Player and the Source set to the Media File Source we created earlier. From Open Source > Set Spectator Screen Mode (Mode = Texture) > Set Spectator Screen Texture (In Texture = the RenderTarget2D we made). You might need to uncheck Context Sensitive and search for some of these, I don’t recall… but that should make it work when you launch VR Preview IF you made the **BP_Actor**. Otherwise you’ll have to wire in the Add Mixed Reality Capture Component node before the Open Source node, and once again make sure in its details that Video Capture and Scene Capture have the right settings as earlier. (A rough C++ equivalent of this wiring is sketched below.)
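If you’d rather wire this up from C++ instead of the level BP, here’s a rough equivalent sketch (not tested; `MR_Player`, `MRFileSource`, and `MR_RenderTarget` are assumed to be editor-assigned references to the Media Player, Media File Source, and RenderTarget2D from the steps above):

```cpp
// Sketch of the BeginPlay wiring described above.
#include "MediaPlayer.h"
#include "FileMediaSource.h"
#include "Engine/TextureRenderTarget2D.h"
#include "HeadMountedDisplayFunctionLibrary.h"

void AMRCaptureSetup::BeginPlay()
{
    Super::BeginPlay();

    MR_Player->PlayOnOpen = true;        // same as the "Play on Open" checkbox
    MR_Player->SetLooping(true);         // same as the "Loop" checkbox, handy for testing
    MR_Player->OpenSource(MRFileSource); // the Media File Source pointing at the green screen clip

    // Send the composited render target to the HMD mirror window.
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::Texture);
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(MR_RenderTarget);
}
```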
Hopefully you too can get JCVD into your VR scene!
Now people smarter than me can hopefully get a live video stream going as input, along with adapting that post-processing/shader so objects can go in front of or stay behind the video capture as needed. Hope this helps us get there as a team!.. or Epic just finishes it… lol… Anyway, hope this helps, and sorry if it’s not the best guide. For the record, I attached the capture component to my controller; without going into more BP to keep its position matched to the controller, you will get rotations from the controller but with a location offset. The video also ends up flickering and unusable, but that could just be my machine, which is from 2012 with a 2nd-gen i7 2600K and a borrowed older Titan card, not a new 10-series chip or whatever.
As of the last live stream (Aug 24, 2017), this capability is one of the Composure compositing plugin’s end goals, according to Sam Dieter (Q&A section at the end of the cast), which leaves things in a funny place with regard to the Mixed Reality Framework plugin that is in beta. So I guess we will see what happens.
Can you share a project or blueprints please?
I made a mixed reality video of my wheelchair controller prototype! I have been waiting for so long on this awesome feature!
VERY AWESOME! What method did you use to get the mixed reality going?