Blending two camera renders in a PostProcess? => Quest Guardian-like effect

Hello,

In the same scene, I have my “main” world, where the player can move and look around, and a closed room (not visible to the player).

During the game, the rendering of the closed room must fade in over the entire screen, blending with the current view of the world.
We would like to mimic the Quest's Guardian effect when the player steps outside the boundary.
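Incidentally, the fade itself is just a time-driven ramp feeding the blend opacity. A minimal sketch in plain C++ (the class and names here are illustrative, not Unreal API; in-engine you would more likely drive a scalar material parameter from a Timeline or from Tick):

```cpp
#include <cstddef>

// Illustrative fade controller: ramps the blend opacity from 0 to 1
// over a fixed duration, then clamps at 1. Not Unreal API.
class FadeController {
public:
    explicit FadeController(float durationSeconds) : duration_(durationSeconds) {}

    // Advance the fade by one frame's delta time.
    void Tick(float deltaSeconds) {
        elapsed_ += deltaSeconds;
        if (elapsed_ > duration_) elapsed_ = duration_;  // clamp at fully faded
    }

    // Current blend opacity in [0, 1], fed to the post-process material.
    float Opacity() const {
        return duration_ > 0.f ? elapsed_ / duration_ : 1.f;
    }

private:
    float duration_;
    float elapsed_ = 0.f;
};
```

Each frame you would pass `Opacity()` into whatever drives the material's opacity parameter.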

Inside the closed room, I’ve placed a SceneCaptureComponent2D.
In a PostProcess material, I progressively blend SceneTexture:PostProcessInput0 with the Render Target captured by the SceneCaptureComponent2D.
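For clarity, the blend that post-process material performs is a per-pixel linear interpolation between the two buffers, the same operation as HLSL's `lerp`. A minimal CPU-side sketch in plain C++ (types and buffer names are illustrative, not the actual material graph):

```cpp
#include <cstddef>

// One linear-space RGB pixel (illustrative type).
struct Pixel { float r, g, b; };

// Linearly interpolate between the main view (PostProcessInput0)
// and the room capture, like lerp(world, room, opacity) in HLSL.
Pixel BlendPixel(const Pixel& world, const Pixel& room, float opacity)
{
    return {
        world.r + (room.r - world.r) * opacity,
        world.g + (room.g - world.g) * opacity,
        world.b + (room.b - world.b) * opacity,
    };
}

// Blend two equally sized buffers into 'out'.
// opacity == 0 shows only the world; opacity == 1 shows only the room.
void BlendBuffers(const Pixel* world, const Pixel* room, Pixel* out,
                  std::size_t count, float opacity)
{
    for (std::size_t i = 0; i < count; ++i)
        out[i] = BlendPixel(world[i], room[i], opacity);
}
```

In the material this is a single Lerp node with the two scene textures as A and B and the fade opacity as Alpha; the sketch only makes the math explicit.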

It works in the editor, of course, but it doesn’t for stereoscopic rendering…
I did a lot of research: stereo layers, the VR Spectator Screen, and every available feature/function in Unreal Blueprints…

Some screenshots showing different blend opacities between the two sub-scenes:

Since we want to recreate the Guardian effect, we must blend two buffers.
Which solutions are possible?

Thanks a lot in advance :pray: :wink:

PS: There are no graphics limitations, as the PC streams directly to the Quest.

I would place the room where it’s supposed to be, and use a Master material with an exposed opacity parameter that can be controlled for everything in that room.

@VictorLerp
Using an opacity parameter means that all of the room’s materials must be Translucent.
We would lose shadows, good lighting, and some post-effects.

We must keep the room’s materials opaque.

I’m not aware of any other native solution, as PostProcess materials are rendered in screen space.

@VictorLerp :confused:
If I added a second VR Pawn in my room, would it be possible to get its FinalColor buffer and access it in a PostProcess,
the same way we can access the main VR Pawn’s final color through PostProcessInput0?

Or is that only possible by writing a C++ plugin?