Mixed Reality Status update

I attached the spectator camera to a Vive controller. Then I held my smartphone just above the Vive controller and mixed the footage in Adobe After Effects. I’m making a better video today!

I am not sure about the difference between using the “Spectator Screen” and the “Mixed Reality Framework” plugin; can anyone explain this?

We have created a mixed reality video for our trailer for “VR Shooter Guns”. We use a custom 4.14 engine build and a Vive tracker. However, we put the footage together afterwards (not in real time):

[video: YouTube]

I posted this on a thread about spectator mode:

I would stick to OBS-based mixed reality because you can sync the camera video and game footage. That said, having streamed-video support in the engine is definitely a good thing.

All this reminds me I have some improvements to the SceneCaptureComponent that allow you to capture multiple textures off a single render. It enabled us to get the foreground and background textures without the extra overhead of two scene captures. I need to update it to the latest engine and submit it.
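For reference, the stock approach looks roughly like the sketch below: two separate capture components, each costing a full scene render. This is a hedged illustration only; the component wiring and names are placeholders, not the modified component described above.

```cpp
// Hedged sketch of the stock two-capture setup (names illustrative).
// Each USceneCaptureComponent2D renders the scene once, so this costs two
// full render passes; the modified SceneCaptureComponent avoids that.
#include "Components/SceneCaptureComponent2D.h"
#include "Engine/TextureRenderTarget2D.h"

void SetupForegroundBackgroundCaptures(USceneCaptureComponent2D* FgCapture,
                                       USceneCaptureComponent2D* BgCapture,
                                       UTextureRenderTarget2D* FgRT,
                                       UTextureRenderTarget2D* BgRT)
{
    // Foreground: keep scene color with alpha so it can be keyed later.
    FgCapture->TextureTarget = FgRT;
    FgCapture->CaptureSource = SCS_SceneColorHDR;

    // Background: final tonemapped color is enough.
    BgCapture->TextureTarget = BgRT;
    BgCapture->CaptureSource = SCS_FinalColorLDR;
}
```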

Have you managed to submit your improvements to SceneCaptureComponent?

Can you please post the material that combines the two render targets?

@emretanirgan Can you please share the Blueprint you used to write the material to a render target? For me, writing the material to a render target does not work for some reason.
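In case it helps others stuck on the same step, here is an untested C++ sketch of the same idea; the Blueprint node “Draw Material to Render Target” wraps the same call. The material “CombineMaterial” and its parameter names “TexA”/“TexB” are assumptions: the material is expected to sample both textures and output the composite through its Emissive channel.

```cpp
// Untested sketch: combine two textures into one render target by drawing
// a material (with two texture parameters) into the target.
#include "Kismet/KismetRenderingLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/TextureRenderTarget2D.h"

void CombineIntoRenderTarget(UObject* WorldContext,
                             UMaterialInterface* CombineMaterial,
                             UTexture* TexA, UTexture* TexB,
                             UTextureRenderTarget2D* OutRT)
{
    // Dynamic instance so the texture parameters can be set at runtime.
    UMaterialInstanceDynamic* MID =
        UMaterialInstanceDynamic::Create(CombineMaterial, WorldContext);
    MID->SetTextureParameterValue(TEXT("TexA"), TexA);
    MID->SetTextureParameterValue(TEXT("TexB"), TexB);

    // Renders one full-screen pass of the material into OutRT.
    UKismetRenderingLibrary::DrawMaterialToRenderTarget(WorldContext, OutRT, MID);
}
```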

I used the mixed reality sample for UE4 from:
https://github.com/richmondx/UE4-
but when compiling, BP_VRCapture2D could not find a function named “SetCaptureRenderTarget” in SteamVRFunctionLibrary.
How can I make sure “SteamVRFunctionLibrary” has been compiled with that function?
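For what it’s worth, that function is not part of the stock SteamVRFunctionLibrary; it appears to come from the custom engine build that sample targets. Something like the following declaration (hypothetical, inferred only from the error message) would have to exist in the fork’s SteamVRFunctionLibrary.h for the Blueprint to compile:

```cpp
// Hypothetical declaration from the custom engine fork. The stock engine
// never shipped this function, which is why BP_VRCapture2D fails to compile
// against a vanilla 4.x build.
UFUNCTION(BlueprintCallable, Category = "SteamVR")
static void SetCaptureRenderTarget(class UTextureRenderTarget2D* RenderTarget);
```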

I don’t know if anyone else has made progress using the Mixed Reality Framework, but I have been able not only to use the Calibrator map to sync controllers to your hands in video, but also to use any Windows-compliant video source, e.g. webcams and HDMI-to-USB3 encoders (for camcorders & DSLRs), to composite the video and the VR camera view directly to the screen. That output can be recorded via an HDMI recorder or a software video capture tool, or even used as a live feed for streaming or a monitor.

Sadly, there is one issue I have run into: an extremely dark image, as though the levels have been crushed dramatically. The XR Framework does use a Spectator Screen object, but the view shown in the headset is well lit. It is only the final output, when you hit Run or compile, that appears dark.
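One untested guess at the dark output: render-target compositing in UE4 often looks crushed when linear color is displayed without sRGB conversion. If the composite goes through a UTextureRenderTarget2D, forcing its display gamma is a cheap thing to try; the function below is just a sketch, not a confirmed fix for the MR Framework itself.

```cpp
// Untested sketch: force standard display gamma on the composite render
// target to work around a possible linear-vs-sRGB mismatch.
#include "Engine/TextureRenderTarget2D.h"

void ApplyDisplayGamma(UTextureRenderTarget2D* CompositeRT)
{
    if (CompositeRT)
    {
        // 0 means "use default"; 2.2 applies standard display gamma.
        CompositeRT->TargetGamma = 2.2f;
    }
}
```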

If anyone is interested in learning more about what I have discovered, I can make a tutorial video and some rough documentation. If anyone out there is already familiar with what I am talking about, any suggestions on how to improve the video output would be graciously accepted.

Here are a few images and a link to a sample video on YouTube: A sample of using Unreal Engine's 4.18 Mixed (video) Reality Framework - YouTube

Great work! I’d really appreciate a tutorial on how you got here.

This thread hasn't been active in a while, so I wanted to ask if there is any usable progress right now?

We need to build a test project in Unreal with the HTC Vive; we have the two controllers and an HTC Tracker we would attach to a camera pointing at a green screen.

But we can't find any documentation on how to set up an Unreal project to get us started…
This is kind of urgent since we've booked a green-screen studio for a test, so any help would be greatly appreciated!

I’m personally looking into this solution here: https://liv.tv/blog/mixed-reality-htc-vive-tracker

It does seem to support only selected titles; however, if there were a way to add your own engine builds on the fly, the tool would be really useful for MR capturing.

Update:
Join the Discord, ask for the SDK, and add games yourself.

Awesome!! Great find until Epic is ready to provide us with their version.

Hey there

can [USER=“1345”]Ben Marsh[/USER] or any of the Unreal Engine devs working on the Mixed Reality Framework give us some information about when we can expect a working version, and what exactly we can expect?
I was quite excited when I upgraded to 4.19 and saw the Mixed Reality Framework plugin. However, trying to actually use it turned out to be impossible. Not only is there absolutely zero documentation, the sample Calibration map included in the plugin folder seems to be completely broken (or wrongly used by me, I can't really tell). I partly got it working once, and even then the calibration process seemed unusable for any end consumer (way too much work and inconsistency compared to some of the simple mixed-reality configurators out there for Unity games).

It also seems to support only in-engine compositing (which produces images that are way too dark) and will not output the depth/background/foreground layers (either as separate output streams or as a split screen). That makes it potentially more user-friendly, but also very limited in use (seriously, most PCs already struggle with VR alone; having to composite the images on the same machine doesn't seem viable at the moment). And it crashed multiple times anyway (I never got past the configuration of the controllers).

So is there actually any progress on this feature?
I guess having it “announced” (or rather mentioned) almost two years ago and shown off at multiple events (including GDC) justifies some updates or info from time to time. It's getting really frustrating (every approach we have tried in Blueprint-only projects so far resulted in a huge performance hit and suboptimal solutions). We would already be happy with a simple way to draw the needed layers to the screen (a 4-way split) and then composite them in external software; a rough sketch of that idea follows below.
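For anyone attempting that 4-split themselves, here is a rough, untested sketch using only stock APIs (SetSpectatorScreenTexture is available from around 4.17/4.18 on). The render targets and names are placeholders, and a real setup would also draw the remaining quadrants (e.g. an alpha matte):

```cpp
// Untested sketch: pack foreground and background captures into quadrants
// of one render target, then route it to the spectator screen so an
// external tool (OBS etc.) can grab and composite it.
#include "Kismet/KismetRenderingLibrary.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "Engine/Canvas.h"
#include "Engine/TextureRenderTarget2D.h"

void DrawQuadrants(UObject* WorldContext, UTextureRenderTarget2D* QuadRT,
                   UTexture* FgTex, UTexture* BgTex)
{
    UCanvas* Canvas = nullptr;
    FVector2D Size;
    FDrawToRenderTargetContext Ctx;
    UKismetRenderingLibrary::BeginDrawCanvasToRenderTarget(
        WorldContext, QuadRT, Canvas, Size, Ctx);

    const FVector2D Half = Size * 0.5f;
    // Top-left: foreground; top-right: background. The bottom quadrants
    // (alpha matte, depth, ...) would be drawn the same way.
    Canvas->K2_DrawTexture(FgTex, FVector2D(0, 0), Half,
                           FVector2D(0, 0), FVector2D::UnitVector,
                           FLinearColor::White, BLEND_Opaque);
    Canvas->K2_DrawTexture(BgTex, FVector2D(Half.X, 0), Half,
                           FVector2D(0, 0), FVector2D::UnitVector,
                           FLinearColor::White, BLEND_Opaque);

    UKismetRenderingLibrary::EndDrawCanvasToRenderTarget(WorldContext, Ctx);

    // Show the packed texture on the spectator screen instead of the eye view.
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::Texture);
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(QuadRT);
}
```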

In case anyone wonders how to combine two textures into one:

I made a demo based on the discussion above.

https://forums.unrealengine.com/development-discussion/vr-ar-development/1476572

Beta testing an MR Capture Plugin. Unlike the MR Framework plugin that ships with UE, this does not composite in-engine. It replicates the existing Unity method of quadrants for compatibility and frictionless use for those with existing MR setups.

Example BP setup:

Sample MR output:
VMRoutput_sample.png

If you have a fully baked experience in 4.16+, please reach out to me. *Only accepting a couple of developers for now. Apologies ahead of time if I don’t respond.


Hello,

Since 4.20 is out, I’m trying to do a Vive tracker camera calibration with the Mixed Reality Capture Calibration Tool. I can complete the whole configuration EXCEPT the Vive tracker as a camera.

The documentation says that you can cycle through the available tracking attachments with the Tab key, but nothing happens when I hit Tab; I only see “No Tracking Attachment”.

https://docs.unrealengine.com/en-us/Platforms/MR/HowToCaptureCalibrationTool

Any ideas?

I’ve not tried it out yet, but it might be better to ask in the sticky feedback topic here:
https://forums.unrealengine.com/development-discussion/vr-ar-development/1512350-mixed-reality-capture-feedback-for-epic
in case it gets more visibility from Epic :wink: