Is this possible to do? Oculus Rift translation masking on different maps?

I’m not sure if this is where I should be making this topic, but here goes.

I have a clear vision of what I want to achieve, but since I’m mainly an artist and musician, I can’t even fathom how I would start to build something like this. I’ve been headbutting against Blueprint and various resources I can find, but I haven’t been able to achieve anything.

I have made a flash animation demonstration of what I would like to achieve, please let me know if this doesn’t work for you. It’s a bit glitchy with the looping, but you should get the idea:

https://dl.dropboxusercontent.com/u/37923267/IdeaDemo.html < Please click this link to see the animated version of the image.

In a nutshell, I want to make a seated Oculus Rift experience where you’re just sitting at a neutral position experiencing one “reality”. When you lean forward, you will gradually enter a second reality, as if you’re pushing your head through a portal. The closest game related implementation I can remember is the effect from Wolfenstein 2009, maybe Bioshock Infinite very slightly.

Likewise, the same thing will happen if you lean back in your chair or position: you’ll enter an entirely different one. Think of moving your head through the surface of the water to look underneath, and likewise, being underwater and pushing your head through to the surface.

With this come vastly different music and visual elements, appearing or being mixed in. I would potentially like to create 3 entirely different visual levels, but with a common structure and sync across them all. What was a tower in one world is a tall tree in another, for example.

I’m not after being told how exactly this could be done, I would just like to know if it’s even possible and what kind of things I should look into learning to achieve this.
Just places to start, basically. I can figure out the sound cue implementation I’m sure, and asset creation is just fine and dandy, I just don’t know how I’d even begin to go about the way this works mechanically and how to do it. I’m used to doing everything myself so blueprints sounded like a great thing for me, but alas, I’m still not getting anywhere.

tl;dr: is it possible to render 3 entirely different maps at the same time but masked, basically?

I have a UE4 sub and I’m very serious about this, I’m willing to actually start a recruitment thread and pay some bright person to help me with the core systems, if there is even a possibility for this kind of thing to work.

Again, sorry if I have put this in the wrong forum section. I’ve been writing music and brainstorming for this for a long, long time and I want to start getting serious about making it actually happen.

I’ve been experimenting with UE’s post processing recently for the Leap Motion, and I know of at least one way you could achieve this.

The UE engine is fully deferred, which means you have access to a lot of information in the post process (depth map, normals, specular, custom depth, etc.). For example, using the depth map I can blend in the virtual world selectively based on geometry while retaining a passthrough camera for everything else. In this example I replace the roof of the real world with a virtual sky.

So one way for you to achieve different worlds would be to render objects/effects into a custom depth map (see Post Process Materials | Unreal Engine Documentation) and use a material shader conditional (as you lean forward, you grow a circular mask from the center of your viewpoint) to mask in the custom depth pixels and special effects.
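To make the lean-driven mask concrete, here is a minimal C++ sketch of the logic the material conditional would implement: the HMD's forward lean is remapped to a circular mask radius in screen UV space, and each pixel is tested against that radius. The `LeanStartCm`/`LeanEndCm` thresholds and the 0.75 UV cover radius are made-up tuning values, not anything from UE4.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical sketch: map the HMD's forward lean (cm from the neutral
// seated position) to a circular mask radius in UV space.
float MaskRadiusFromLean(float LeanCm)
{
    const float LeanStartCm = 5.0f;   // mask begins to open here (assumed)
    const float LeanEndCm   = 30.0f;  // mask covers the whole view here (assumed)
    float T = (LeanCm - LeanStartCm) / (LeanEndCm - LeanStartCm);
    T = std::clamp(T, 0.0f, 1.0f);
    // 0.75 UV units is enough to reach the corners of the viewport
    return T * 0.75f;
}

// True when the pixel at (U, V) should show the "other" world.
bool PixelInMask(float U, float V, float Radius)
{
    const float DU = U - 0.5f, DV = V - 0.5f;
    return std::sqrt(DU * DU + DV * DV) < Radius;
}
```

In a material you would express the same thing with a Distance node against the screen-position center and a scalar parameter driven from Blueprint.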

Additionally if you want a different viewport you can combine this with a render-to-texture SceneCapture 2D object to render the different view to a texture and then blend those pixels on top using your mask.

While this isn’t exactly a difficult-to-grasp feature, it can get complicated when you consider that the Rift renders two views, and you have to center your masks around the respective left and right image centers for the effect to look correct in the Rift.
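The per-eye correction amounts to picking a different mask center depending on which half of the frame a pixel falls in. A small sketch, assuming a side-by-side stereo layout (which is how the Rift's combined image is laid out):

```cpp
// Per-eye mask centres: in side-by-side stereo, UV (0.5, 0.5) is the centre
// of the combined image, not of either eye. Pick the centre of the half the
// pixel falls in instead.
void EyeMaskCenter(float U, float& OutCenterU, float& OutCenterV)
{
    OutCenterU = (U < 0.5f) ? 0.25f : 0.75f; // left eye | right eye
    OutCenterV = 0.5f;
}
```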

If you don’t want to deal with 2 separate images, you can use a simpler approach: have the actors and/or their materials change as you lean in past a certain point. You would need a way to track each actor, switch them from hidden to visible, blend them in using a translucent material, and then perhaps swap their materials to fully opaque ones once you’ve leaned in fully. The challenge there would be building a decently optimized system to manage all of the material/mesh swapping, and you would be limited to one type of view.
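The per-actor swap described above can be sketched as a tiny state machine: as the normalized lean amount crosses thresholds, an actor goes hidden, then translucent with a blended opacity, then fully opaque. The thresholds and type names here are assumptions for illustration, not UE4 API:

```cpp
#include <algorithm>

// Hypothetical per-actor visibility state driven by lean amount.
enum class WorldState { Hidden, Blending, Opaque };

struct ActorVisual
{
    WorldState State = WorldState::Hidden;
    float Opacity = 0.0f; // only meaningful while Blending
};

void UpdateActor(ActorVisual& A, float LeanAlpha) // LeanAlpha in [0, 1]
{
    const float ShowAt = 0.1f, OpaqueAt = 0.9f; // assumed thresholds
    if (LeanAlpha < ShowAt)
    {
        A.State = WorldState::Hidden;  A.Opacity = 0.0f;
    }
    else if (LeanAlpha > OpaqueAt)
    {
        A.State = WorldState::Opaque;  A.Opacity = 1.0f;
    }
    else
    {
        A.State = WorldState::Blending;
        A.Opacity = (LeanAlpha - ShowAt) / (OpaqueAt - ShowAt);
    }
}
```

In UE4 terms, `Hidden`/`Opaque` would map to SetActorHiddenInGame plus a material swap, and `Blending` to driving a translucent material's opacity parameter.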

Thank you for replying! I definitely have some more ideas on where to start now. The depth mask stuff looks interesting.

What I was kind of thinking is: could there be a way to tie a global transparent masking shader onto every object’s material, tied to your view? So as you move forward, some sort of mask expands out from your centre/outside your view, and as it does, the transparent animated material effect increases all its values until the objects just kind of “dissolve” out of place? Then, somehow, make this dissolving effect only visible where the mask is?

I might have explained this really badly, as I don’t fully understand how stuff works and how to get it done but I can kind of assemble some kind of pseudo logic to a possible solution.

Materials can expose parameters that can be changed in Blueprint. The parameters are typically bound by name, so if you use the same parameter name in each material, you can change different materials with the same type of function call. It’s a question of how you would structure your blueprints so that every material in the level is affected by the same parameter. One way would be to use a Blueprint function library with a function to set the global variable and another to read it, which can be called from every blueprint. There may be a smarter way to do this; I haven’t put too much thought into it.
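A rough C++ analogue of that pattern: one globally stored scalar (the "lean" value) that everything reads and writes by a shared name, which is essentially what the function library's setter/getter pair would do. The parameter name `"WorldMask"` is made up for the example:

```cpp
#include <map>
#include <string>

// Global name -> value store, standing in for the Blueprint function
// library's shared variable.
static std::map<std::string, float> GlobalScalarParams;

void SetGlobalScalar(const std::string& Name, float Value)
{
    GlobalScalarParams[Name] = Value;
}

float GetGlobalScalar(const std::string& Name)
{
    auto It = GlobalScalarParams.find(Name);
    return It != GlobalScalarParams.end() ? It->second : 0.0f; // default 0
}
```

Each material instance would then apply the value read for its identically named parameter, e.g. every tick from the actor's blueprint.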

Do note that one limitation of UE’s renderer is that you cannot have a translucent material that renders as fully opaque. If you want to mix opaque and see-through pixels, you use a masked material (1.8 - Opacity Mask | Unreal Engine Documentation). Some devs have used this to great effect, making walls appear to be generated from a 3D point (What Are You Working On? Community Screenshots & Videos - Work in Progress - Unreal Engine Forums). Another way you could use transparency is to start with a transparent material and, when it has transitioned to ‘fully opaque’, swap the material on the fly to an opaque one (I haven’t tried this; I wonder if the swap would be noticeable).
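For reference, a masked material's per-pixel behaviour is just a hard clip: the opacity mask value is compared against a threshold and the pixel is either fully drawn or fully discarded, with no partial transparency. A sketch of that test (0.3333 is UE4's default Opacity Mask Clip Value, to the best of my knowledge):

```cpp
// Masked-material clip: a pixel either survives or is discarded outright.
bool PixelSurvivesOpacityMask(float OpacityMask, float ClipValue = 0.3333f)
{
    return OpacityMask >= ClipValue;
}
```

This hard edge is why masked materials dither or look aliased during a "dissolve", while translucent materials blend smoothly but cost more and sort poorly.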

That example forum post you linked is almost exactly what I’m after. I still feel a bit out of my depth, but I’m going to try and see if I can learn how. Thanks for the links and insight!