Mixed Reality Status update


  • started a topic [VIVE] Mixed Reality Status update


    Does anyone have a status update about when mixed reality video recording will be possible with UE?

    Basically I'd like to know if there is a branch where the 2D window can be set to a camera other than the VR camera. The last update I can find is this: https://twitter.com/EpicJamesG/statu...761597440?s=09
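
    For anyone else digging: the Spectator Screen functions that shipped around 4.17 look like the relevant hook for this. A minimal, untested C++ sketch of what I mean (the function and render target names are placeholders, not anything from this thread):

```cpp
// Untested sketch (UE 4.17+): point the desktop spectator window at a
// SceneCapture2D's render target instead of the VR eye mirror.
#include "HeadMountedDisplayFunctionLibrary.h"
#include "Engine/TextureRenderTarget2D.h"

void UseExternalCameraForDesktopWindow(UTextureRenderTarget2D* SpectatorRT)
{
    // Switch the spectator screen from the default eye mirror to a texture...
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::Texture);

    // ...and feed it the render target a SceneCapture2D in the level writes to.
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(SpectatorRT);
}
```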

  • replied
    Originally posted by GJ
    Hello,

    Since 4.20 is out, I'm trying to do a Vive tracker camera calibration with the Mixed Reality Capture Calibration Tool. I can complete the whole configuration EXCEPT setting the Vive tracker as a camera.

    The documentation says that you can go through the available tracking attachments with the Tab key, but nothing happens when I hit Tab; I only see "No Tracking Attachment".

    https://docs.unrealengine.com/en-us/...alibrationTool

    Any ideas?
    I've not tried it out yet, but it might be better to ask in the sticky feedback topic here:
    https://forums.unrealengine.com/deve...dback-for-epic
    in case it gets more visibility from Epic.



  • replied
    Hello,

    Since 4.20 is out, I'm trying to do a Vive tracker camera calibration with the Mixed Reality Capture Calibration Tool. I can complete the whole configuration EXCEPT setting the Vive tracker as a camera.

    The documentation says that you can go through the available tracking attachments with the Tab key, but nothing happens when I hit Tab; I only see "No Tracking Attachment".

    https://docs.unrealengine.com/en-us/...alibrationTool

    Any ideas?



  • replied
    Beta testing an MR Capture Plugin. Unlike the MR framework plugin supplied by UE, this is not in-engine compositing. It replicates the existing Unity method of quadrants for compatibility and frictionless play for those with existing MR setups.
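
    For those unfamiliar with the quadrant approach, a rough sketch of the idea in plain UE4 C++ follows. This is illustrative only, not the plugin's actual code, and the quadrant assignment is an assumption; check your compositor's convention:

```cpp
// Sketch: tile foreground and background captures into one render target that
// an external compositor (OBS etc.) can crop into quadrants.
#include "Kismet/KismetRenderingLibrary.h"
#include "Engine/Canvas.h"
#include "Engine/TextureRenderTarget2D.h"

void DrawQuadrants(UObject* WorldContext,
                   UTextureRenderTarget2D* Output, // final quad-view target
                   UTexture* ForegroundRT,         // scene in front of the HMD
                   UTexture* BackgroundRT)         // full scene
{
    UCanvas* Canvas = nullptr;
    FVector2D Size;
    FDrawToRenderTargetContext Ctx;
    UKismetRenderingLibrary::BeginDrawCanvasToRenderTarget(WorldContext, Output, Canvas, Size, Ctx);

    const FVector2D Half = Size * 0.5f;
    // Top-left quadrant: foreground colour (a foreground-alpha quadrant would
    // typically go top-right in the Unity-style layout).
    Canvas->K2_DrawTexture(ForegroundRT, FVector2D::ZeroVector, Half,
                           FVector2D::ZeroVector, FVector2D::UnitVector);
    // Bottom-left quadrant: background (the full scene capture).
    Canvas->K2_DrawTexture(BackgroundRT, FVector2D(0.f, Half.Y), Half,
                           FVector2D::ZeroVector, FVector2D::UnitVector);

    UKismetRenderingLibrary::EndDrawCanvasToRenderTarget(WorldContext, Ctx);
}
```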

    Example BP setup:
    [Image: image_138417.png — example Blueprint setup]

    Sample MR output:
    [Image: VMRoutput_sample.png — sample MR output]


    If you have a fully baked experience in 4.16+, please reach out to me. Only accepting a couple of developers for now. Apologies ahead of time if I don't respond.

    FOO
    Last edited by alsoknownasfoo; 05-22-2018, 01:40 PM.



  • replied
    I made a demo based on the discussion above.

    https://forums.unrealengine.com/deve...opment/1476572



  • replied
    In case anyone wonders how to combine two textures into one:

    https://blueprintue.com/blueprint/5x2fqktf/
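
    For reference, a rough C++ equivalent of the same idea (the linked Blueprint is not reproduced here). It assumes a hypothetical material "M_Combine" whose graph samples two texture parameters "TexA" and "TexB" and lays them out side by side, e.g. via a TexCoord with UTiling = 2 as described later in this thread:

```cpp
// Sketch: combine two textures into one render target with a single
// full-screen material pass.
#include "Kismet/KismetRenderingLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Engine/TextureRenderTarget2D.h"

void CombineTextures(UObject* WorldContext,
                     UMaterialInterface* CombineMat, // hypothetical M_Combine
                     UTexture* TexA, UTexture* TexB,
                     UTextureRenderTarget2D* Output)
{
    // Bind the two source textures to the material's parameters.
    UMaterialInstanceDynamic* MID = UMaterialInstanceDynamic::Create(CombineMat, WorldContext);
    MID->SetTextureParameterValue(TEXT("TexA"), TexA);
    MID->SetTextureParameterValue(TEXT("TexB"), TexB);

    // Render one full-screen pass of the material into the output target.
    UKismetRenderingLibrary::DrawMaterialToRenderTarget(WorldContext, Output, MID);
}
```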

    Last edited by fengkan; 05-12-2018, 06:25 AM.



  • replied
    Hey there

    Can Ben Marsh or any of the Unreal Engine devs working on the Mixed Reality Framework give us some information about when we can expect a working version, and what exactly we can expect?

    I was quite excited when I upgraded to 4.19 and saw the Mixed Reality Framework plugin. However, trying to actually use it turned out to be impossible. Not only is there absolutely zero documentation, the sample calibration map included in the folder seems to be completely broken (or wrongly used by me, which I can't really tell). I partly got it working once, and even then the calibration process seemed unusable for any end consumer (way too much work and inconsistency compared to some of the simple mixed reality configurators out there for Unity games). It also seems to support only in-engine compositing (which produces images that are way too dark) and does not output the depth/background/foreground layers (either as separate output streams or as a split screen). That makes it potentially more user friendly, but also very limited in use (seriously, most PCs already struggle with VR alone; having to composite the images on the same machine doesn't seem viable at the moment). And it crashed multiple times anyway (I never got past the configuration of the controllers).

    So is there actually any progress on this feature?
    Having it "announced" (or rather mentioned) almost two years ago and shown off at multiple events (including GDC) justifies some updates or info from time to time. It's getting really frustrating (every approach we've tried in Blueprint-only projects so far resulted in a huge performance hit and suboptimal solutions). We would already be happy with a simple way to draw the needed layers to the screen (4-split) and then composite them in external software.



  • replied
    Awesome!! Great find until Epic is ready to provide us with their version.



  • replied
    Originally posted by SamuelEnslin
    This thread hasn't been active in a while, so I wanted to ask if there is any usable progress right now?

    We need to build a test project in Unreal with the HTC Vive; we have the two controllers and an HTC Tracker that we would attach to a camera pointing at a green screen.

    But we can't find any documentation on how to set up an Unreal project to get us started...
    This is kind of urgent, since we've optioned a green-screen studio for a test, so any help would be greatly appreciated!
    I'm personally looking into this solution here: https://liv.tv/blog/mixed-reality-htc-vive-tracker

    It does seem to support only selected titles; however, if there were a way to add your own engine builds on the fly, the tool would be really great for MR capturing.

    Update:
    Join the Discord, ask for the SDK, and add games yourself.



  • replied
    This thread hasn't been active in a while, so I wanted to ask if there is any usable progress right now?

    We need to build a test project in Unreal with the HTC Vive; we have the two controllers and an HTC Tracker that we would attach to a camera pointing at a green screen.

    But we can't find any documentation on how to set up an Unreal project to get us started...
    This is kind of urgent, since we've optioned a green-screen studio for a test, so any help would be greatly appreciated!
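
    For anyone in the same spot, one stock-engine starting point: the SteamVR plugin can read a tracker's pose directly, which is enough to drive a SceneCapture2D standing in for the real camera. A hedged sketch; in 4.1x the tracker tends to enumerate as ESteamVRTrackedDeviceType::Other, but that should be verified per setup:

```cpp
// Sketch: update a SceneCapture2D every Tick from an HTC Vive Tracker's pose,
// using only the stock SteamVR plugin (no MR framework needed).
#include "SteamVRFunctionLibrary.h"
#include "Components/SceneCaptureComponent2D.h"

void UpdateCaptureFromTracker(USceneCaptureComponent2D* Capture)
{
    TArray<int32> DeviceIds;
    USteamVRFunctionLibrary::GetValidTrackedDeviceIds(
        ESteamVRTrackedDeviceType::Other, DeviceIds);

    if (DeviceIds.Num() == 0)
    {
        return; // no tracker found this frame
    }

    FVector Position;
    FRotator Orientation;
    if (USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(
            DeviceIds[0], Position, Orientation))
    {
        // Poses are in tracking space, relative to the VR origin; attach the
        // capture under the same origin actor so the spaces line up.
        Capture->SetRelativeLocationAndRotation(Position, Orientation);
    }
}
```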



  • replied
    Originally posted by mebalzer
    I don't know if anyone else has made any progress using the Mixed Reality Framework, but I have not only been able to use the Calibrator map to sync controllers to your hands in video, but also to use any Windows-compliant video source, e.g. webcams and HDMI-USB3 encoders (for camcorders and DSLRs), to composite the video and VR camera directly to the screen, which can be recorded via an HDMI recorder or a software-based video capture tool, or even used as a live feed for streaming or a monitor. Sadly, there is one issue I have run into, and that is an extremely dark image, as though the levels have been crushed dramatically. The XR Framework does use a Spectator Screen object, but the one shown in the headset is well lit. It is just the final output, when you hit Run or compile, that appears dark.

    If anyone is interested in learning more about what I have discovered, I can make a tutorial video and some rough documentation. If anyone out there is already familiar with what I am talking about, any suggestions on how to improve the video output would be graciously accepted.

    Here are a few images and a link to a sample video on YouTube: https://youtu.be/h3c4eChdhzY
    Great work! I'd really appreciate a tutorial on how you got here.



  • replied
    I don't know if anyone else has made any progress using the Mixed Reality Framework, but I have not only been able to use the Calibrator map to sync controllers to your hands in video, but also to use any Windows-compliant video source, e.g. webcams and HDMI-USB3 encoders (for camcorders and DSLRs), to composite the video and VR camera directly to the screen, which can be recorded via an HDMI recorder or a software-based video capture tool, or even used as a live feed for streaming or a monitor. Sadly, there is one issue I have run into, and that is an extremely dark image, as though the levels have been crushed dramatically. The XR Framework does use a Spectator Screen object, but the one shown in the headset is well lit. It is just the final output, when you hit Run or compile, that appears dark.

    If anyone is interested in learning more about what I have discovered, I can make a tutorial video and some rough documentation. If anyone out there is already familiar with what I am talking about, any suggestions on how to improve the video output would be graciously accepted.

    Here are a few images and a link to a sample video on YouTube: https://youtu.be/h3c4eChdhzY
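
    One hedged guess at the dark output, not a confirmed fix: render targets along this path can get an sRGB/gamma conversion applied (or skipped) twice. Settings worth experimenting with on the capture's render target:

```cpp
// Speculative sketch: gamma settings to try on the render target when the
// composited output looks crushed/dark compared to the in-headset view.
#include "Engine/TextureRenderTarget2D.h"

void ConfigureCaptureTarget(UTextureRenderTarget2D* RT)
{
    // Treat the target as linear and let TargetGamma handle display encoding.
    RT->bForceLinearGamma = true;
    RT->TargetGamma = 2.2f; // compare against 0 (engine default) and pick the one that matches the headset
}
```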



  • replied
    I used the mixed reality sample for UE4 at:
    https://github.com/richmondx/UE4-MixedReality
    but when compiling, BP_VRCapture2D could not find a function named "SetCaptureRenderTarget" in SteamVRFunctionLibrary.
    How can I make sure "SteamVRFunctionLibrary" has been compiled with that function?



  • replied
    @emretanirgan Can you please share the Blueprint you used to write the material to a render target? For me, writing the material to a render target does not work for some reason.



  • replied
    Originally posted by emretanirgan
    In case anyone is wondering, with the latest 'Spectator Screen' changes it should now be possible to do proper mixed reality capture without the need for a custom plugin. I'm still playing with it and I want to write up something once I test it out, but here's how I'm doing it so far:

    - Have 2 SceneCapture2D Actors in the scene. Each one renders to a different Render Target. One will be used for capturing just foreground (what's between the headset and the SceneCapture2D), and the other one will capture the whole scene. You can use something like the green screen shader that was mentioned in this thread for the foreground capture object. And you can check out this thread to figure out how to set the transform of the SceneCapture2D Actors to be the same as the 3rd Vive controller transform. (Since Oculus also announced support for additional tracked controllers, it should also be possible to use the transform coming from a 3rd Touch).
    - Set your spectator mode to 'Texture Plus Eye', and have the eye take up one quadrant, while the texture takes up the whole view. Have the eye draw over the texture.
    - Create a PostProcess Material that combines two Render Targets into one image. I can post my material nodes once I'm done testing the whole thing, but basically set the UTiling and VTiling of your TexCoord[0] to 2, then sample the two Texture Samples and combine them so each one takes one half of the resulting image.
    - Draw this material to another render target on Tick and set that to be the Spectator Mode Texture.
    - Either capture the individual quadrants in OBS and composite them so that the real camera is sandwiched between your foreground and background, or capture the whole window, then do your editing and compositing in another software like After Effects or Premiere. I'd prefer the latter if you don't have to stream your mixed reality footage, since you get a lot more control over the final result.

    That should be it! I don't know if this is the most optimized way to do it, but I just did a simple test on my Oculus where I had two scene captures in my scene capturing every frame, and it was comfortably running at 90fps. Whereas with the plugin that was mentioned in the thread over the past couple of months, I had lower-res render targets with scene captures that weren't even running every frame, and most of the time I was at 45 FPS on Vive.

    I'll try to test it out and write up something within the next few weeks if that'd be helpful. Here's an example screenshot below that I got using the steps I mentioned.

    [Attachment: example screenshot]
    Can you please post the material that combines the two render targets?
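
    Not the material itself, but in the meantime, step 2 of the quoted setup (the 'Texture Plus Eye' layout) can be expressed roughly like this; the rect choices mirror the description above and are otherwise assumptions:

```cpp
// Sketch: 'Texture Plus Eye' spectator mode with the eye view in the top-left
// quadrant and the combined texture filling the whole window, eye drawn on top.
#include "HeadMountedDisplayFunctionLibrary.h"

void SetupTexturePlusEye(UTexture* CombinedRT)
{
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(
        ESpectatorScreenMode::TexturePlusEye);

    // Rects are in 0..1 UV space of the spectator window.
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenModeTexturePlusEyeLayout(
        FVector2D(0.0f, 0.0f),    // eye rect min (top-left quadrant)
        FVector2D(0.5f, 0.5f),    // eye rect max
        FVector2D(0.0f, 0.0f),    // texture rect min (full view)
        FVector2D(1.0f, 1.0f),    // texture rect max
        /*bDrawEyeFirst=*/ false, // texture first, so the eye draws over it
        /*bClearBlack=*/   false);

    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(CombinedRT);
}
```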

