Mixed Reality Status update

I decided to fool around with the Mixed Reality Framework plugin yesterday and got some good results. Mind you, it did crash a few times, but it is a beta plugin, so that was to be expected. After enabling the plugin (I'm using 4.17.1) and the ability to view plugin content in your editor (if you don't know how: click the View Options dropdown at the bottom right of the Content Browser and check the box), browse to the MixedRealityFramework Content folder in the Sources panel (in the Content Browser, click the icon underneath the Add New dropdown and to the left of the Filters dropdown). You will see the following contents; the circled items are what you should see…

Open up the M_MRVidProcessing material and break the link between the Texture Object and the Texture Sample so it won't override the video we send it.

Now make a Media File Source. Grab some green screen test footage from YouTube or wherever and download it, because for now that seems to be the only way to do this: streaming via URL does not work, and I have no idea where to live stream to, stream that back in, AND have it be compatible. I tried YouTube Live and it didn't work. I also tried a few other sites for just streaming into the project and none worked, so someone with more knowledge is needed there…
After you have your downloaded video, open up your Media File Source and set the File Path to the video.
Now open the MRVideoSource Media Player; its settings should be visible under the large black window. Make sure Play on Open is checked (it should be by default), and for testing I'd check Loop as well.
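If anyone would rather wire the media side up from C++ instead of in the editor, here's a rough, untested sketch of the same settings. The function name and the file path are just my own placeholders, and it assumes the MediaAssets module is listed in your Build.cs.

```cpp
// Rough C++ equivalent of the Media File Source + Media Player setup above.
// Assumes "MediaAssets" has been added to PublicDependencyModuleNames in Build.cs.
#include "FileMediaSource.h"
#include "MediaPlayer.h"

void ConfigureGreenScreenMedia(UFileMediaSource* FileSource, UMediaPlayer* MediaPlayer)
{
    if (!FileSource || !MediaPlayer)
    {
        return;
    }

    // Same as setting the File Path on the Media File Source asset
    // (the path is just a placeholder for your downloaded green screen clip).
    FileSource->SetFilePath(TEXT("C:/Videos/GreenScreenTest.mp4"));

    // Mirrors the Media Player options: Play on Open, plus Loop for testing.
    MediaPlayer->PlayOnOpen = true;
    MediaPlayer->SetLooping(true);
}
```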
Now we just need to get a MixedRealityCapture component into the scene, so you can either make a **BP_Actor** and add the component, or create one in the **level BP**. In the end we are going to send this to a spectator screen. If you make the **BP_Actor**, drag it into the scene and, in its Details, click on the **MixedRealityCapture** component, scroll down to Video Capture, and ensure Media Source = MRVideoSource and Video Processing Material = M_MRVidProcessing. Then scroll down further to Scene Capture, make a RenderTarget2D for the Texture Target, and set the Composite Mode to Composite.
If you create one in the **level BP**, most things will be set by default **except for the Texture Target**, which you'll have to set to the newly created RenderTarget2D we just made.
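For what it's worth, the Scene Capture section in those Details looks like the standard scene capture properties, so here's a sketch of the render target + Composite Mode part in C++ using a plain USceneCaptureComponent2D. I've left the Video Capture fields (Media Source / Video Processing Material) to the editor since those are plugin-specific; the function name and the 1280x720 size are just examples.

```cpp
// Sketch: make a RenderTarget2D and point a scene capture's Texture Target at it,
// with Composite Mode set to Composite, like the Details panel settings above.
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/KismetRenderingLibrary.h"

UTextureRenderTarget2D* SetUpCaptureTarget(UObject* WorldContext, USceneCaptureComponent2D* Capture)
{
    if (!Capture)
    {
        return nullptr;
    }

    // Equivalent of creating a new RenderTarget2D asset (size is just an example).
    UTextureRenderTarget2D* RenderTarget =
        UKismetRenderingLibrary::CreateRenderTarget2D(WorldContext, 1280, 720);

    // Texture Target = our render target, Composite Mode = Composite.
    Capture->TextureTarget = RenderTarget;
    Capture->CompositeMode = SCCM_Composite;

    return RenderTarget;
}
```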
Now open the level BP….

Make a variable, call it **MR_Player** or whatever, and set its type to Media Player. Then do Event BeginPlay > Open Source (Media Player target); the Target is our MR_Player and the Source is the Media File Source we created earlier. From Open Source > Set Spectator Screen Mode (Mode is Texture) > Set Spectator Screen Texture (In Texture is the RenderTarget2D we made). You might need to uncheck Context Sensitive and search for some of these, I don't recall. That should make it work when you launch VR Preview IF you made the BP_Actor; otherwise you'll have to wire in the Add Mixed Reality Capture Component node before the Open Source node, and once again make sure in the Details that Video Capture and Scene Capture have the right settings as earlier.
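If it helps to see that chain written out, here's a sketch of the same BeginPlay logic in C++ rather than the level BP. This is untested on my end; the function and parameter names are placeholders, and it assumes the MediaAssets and HeadMountedDisplay modules are in your Build.cs.

```cpp
// Sketch of the level BP chain:
// Event BeginPlay -> Open Source -> Set Spectator Screen Mode -> Set Spectator Screen Texture.
#include "MediaPlayer.h"
#include "FileMediaSource.h"
#include "Engine/TextureRenderTarget2D.h"
#include "HeadMountedDisplayFunctionLibrary.h"

void StartMixedRealitySpectator(UMediaPlayer* MRPlayer,
                                UFileMediaSource* GreenScreenSource,
                                UTextureRenderTarget2D* MRRenderTarget)
{
    if (!MRPlayer || !GreenScreenSource || !MRRenderTarget)
    {
        return;
    }

    // Open Source: Target = MR_Player, Source = the Media File Source from earlier.
    MRPlayer->OpenSource(GreenScreenSource);

    // Set Spectator Screen Mode (Mode = Texture), then hand it our RenderTarget2D.
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::Texture);
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(MRRenderTarget);
}
```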

Hopefully you too can get JCVD into your VR scene!

Now people smarter than me can hopefully get live video stream input going, along with adapting that post processing/shader so objects can go in front of or stay behind the video capture as needed. Hope this helps us get there as a team! …or Epic just finishes it… lol… Anyway, hope this helps, and sorry if it's not the best of a guide. For the record, I attached it to my controller; without going into more BP and having its position match the controller, you will get rotations from the controller but a location offset. BUT the video ends up flickering and unusable, though that could just be my machine, which is from 2012 with a second-gen i7 2600K and a borrowed older Titan card, not the new 10-series chip or whatever.