Experimental stereo camera rig for portals/transport effects in UE4

I thought I would share my experiments in getting a secondary stereo view for things such as portals and transport effects.

I'm also posting this in the hope that Oculus/Epic can help me take this further, or look into adding this to the engine themselves,
as it would be a really handy thing to have (I believe Unity already has its own stereo rig).

My own reason for needing this is to implement a cool transporter effect: the stereo rig's view of the new location is masked in over the current one, the player is then moved to the new location, and the view from the standard Oculus camera is masked back in.

Like so,

To get this far I needed to do a few things.
Firstly, I needed to customize the SceneCaptureComponent to give me the option of using the HMD's left/right eye projection matrices.

So I added an enum to SceneCaptureComponent.h, along with an editable variable.

#pragma once
#include "Runtime/Engine/Public/ShowFlags.h"
#include "SceneCaptureComponent.generated.h"

USTRUCT()
struct FEngineShowFlagsSetting
{
	GENERATED_USTRUCT_BODY()

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = SceneCapture)
	FString ShowFlagName;

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = SceneCapture)
	bool Enabled;
};

UENUM()
enum class EMyEye : uint8
{
	EYE_Off		UMETA(DisplayName = "Off"),
	EYE_Left	UMETA(DisplayName = "Left Eye"),
	EYE_Right	UMETA(DisplayName = "Right Eye")
};

// -> will be exported to EngineDecalClasses.h
UCLASS(hidecategories=(abstract, Collision, Object, Physics, SceneComponent, Mobility), MinimalAPI)
class USceneCaptureComponent : public USceneComponent
{
	GENERATED_UCLASS_BODY()

	UPROPERTY(EditAnywhere, Category = SceneCapture, meta = (DisplayName = "StereoEye"))
	EMyEye StereoEye;

	// ... rest of class unchanged ...
};

Then some changes to FScene::CreateSceneRenderer within SceneCaptureRendering.cpp.
This checks StereoEye on the SceneCapture component and provides the HMD's left/right eye perspective projection, or the default projection when it is set to EMyEye::EYE_Off.

FSceneRenderer* FScene::CreateSceneRenderer( USceneCaptureComponent* SceneCaptureComponent, UTextureRenderTarget* TextureTarget, const FMatrix& ViewMatrix, const FVector& ViewLocation, float FOV, float MaxViewDistance, bool bCaptureSceneColour, FPostProcessSettings* PostProcessSettings, float PostProcessBlendWeight )
{
	FIntPoint CaptureSize(TextureTarget->GetSurfaceWidth(), TextureTarget->GetSurfaceHeight());

	FTextureRenderTargetResource* Resource = TextureTarget->GameThread_GetRenderTargetResource();
	FSceneViewFamilyContext ViewFamily(FSceneViewFamily::ConstructionValues( /* ... unchanged ... */ ));

	FSceneViewInitOptions ViewInitOptions;
	ViewInitOptions.SetViewRectangle(FIntRect(0, 0, CaptureSize.X, CaptureSize.Y));
	ViewInitOptions.ViewFamily = &ViewFamily;
	ViewInitOptions.ViewMatrix = ViewMatrix;
	ViewInitOptions.BackgroundColor = FLinearColor::Black;
	ViewInitOptions.OverrideFarClippingPlaneDistance = MaxViewDistance;
	ViewInitOptions.SceneViewStateInterface = SceneCaptureComponent->GetViewState();
	if (bCaptureSceneColour)
	{
		ViewFamily.EngineShowFlags.PostProcessing = 0;
		ViewInitOptions.OverlayColor = FLinearColor::Black;
	}

	// Build projection matrix
	{
		float XAxisMultiplier;
		float YAxisMultiplier;

		if (CaptureSize.X > CaptureSize.Y)
		{
			// if the viewport is wider than it is tall
			XAxisMultiplier = 1.0f;
			YAxisMultiplier = CaptureSize.X / (float)CaptureSize.Y;
		}
		else
		{
			// if the viewport is taller than it is wide
			XAxisMultiplier = CaptureSize.Y / (float)CaptureSize.X;
			YAxisMultiplier = 1.0f;
		}

		if (SceneCaptureComponent->StereoEye != EMyEye::EYE_Off && GEngine->HMDDevice.Get())
		{
			if (GEngine->HMDDevice.Get()->IsStereoEnabled())
			{
				//UE_LOG(LogTemp, Warning, TEXT("LEFT MATRIX"));
				if (SceneCaptureComponent->StereoEye == EMyEye::EYE_Left)
				{
					ViewInitOptions.ProjectionMatrix = GEngine->HMDDevice.Get()->GetStereoProjectionMatrix(EStereoscopicPass::eSSP_LEFT_EYE, 90 /*FOV CURRENTLY UNUSED BY ALL IMPLEMENTATIONS*/);
					ViewInitOptions.StereoPass = EStereoscopicPass::eSSP_LEFT_EYE; // IS THIS NEEDED?
				}
				else if (SceneCaptureComponent->StereoEye == EMyEye::EYE_Right)
				{
					ViewInitOptions.ProjectionMatrix = GEngine->HMDDevice.Get()->GetStereoProjectionMatrix(EStereoscopicPass::eSSP_RIGHT_EYE, 90 /*FOV CURRENTLY UNUSED BY ALL IMPLEMENTATIONS*/);
					ViewInitOptions.StereoPass = EStereoscopicPass::eSSP_RIGHT_EYE; // IS THIS NEEDED?
				}
			}
		}
		else // END OPAMP
		{
			ViewInitOptions.ProjectionMatrix = FReversedZPerspectiveMatrix( /* ... default projection, unchanged ... */ );
		}
	}

	FSceneView* View = new FSceneView(ViewInitOptions);
	View->bIsSceneCapture = true;

	for (auto It = SceneCaptureComponent->HiddenComponents.CreateConstIterator(); It; ++It)
	{
		// If the primitive component was destroyed, the weak pointer will return NULL.
		UPrimitiveComponent* PrimitiveComponent = It->Get();
		if (PrimitiveComponent)
		{
			View->HiddenPrimitives.Add(PrimitiveComponent->ComponentId);
		}
	}

	View->OverridePostProcessSettings(*PostProcessSettings, PostProcessBlendWeight);

	return FSceneRenderer::CreateSceneRenderer(&ViewFamily, NULL);
}

Next was to create a blueprint with a couple of SceneCapture2D components, set them to left eye and right eye, and have them render to a couple of 960x1080 render targets.
(Ignore the nearclip/farclip/disabletonemapper variables; they relate to something else I was working on.)

Within the blueprint we need to set the correct IPD for the capture components.


And, for testing purposes, make them match the HMD's position/orientation.

And a post-process material is attached to the camera component of my pawn.


Now this works fine with OVR 4.4.0 (the images should match up).

But porting this to a newer OVR runtime caused some issues, as the eyes were no longer lining up.
After a while I realised it is due to the 12-pixel padding that the Oculus eye textures now require.
As a temporary measure, setting the padding to 0 pixels in OculusRiftHMD.cpp fixes the issue.
But I would need to look at adding the padding, and/or possibly changing the horizontal FOV of the projection matrix to compensate.

class FSettings : public FHMDSettings
{
	const int TexturePaddingPerEye = 0; // padding, in pixels, per eye (total padding will be doubled) // OPAMP: was originally 12
	// ...
};

I won't go into any more detail about how the transport effect was done, as this is getting a bit long and I really want to talk about the limitations of this currently ugly hack,
and the possibility of Oculus/Epic helping me out with this or providing a proper stereo rig to use for effects such as transportation and portals.

The main limitation of this hack is the lack of timewarp/latency prediction. For a relatively short effect like my transporter it's not too much of an issue (although it's still an issue).

Check out the lag with rapid head movements (the video glitches from 0:10 to 0:15 are unrelated).

So I'm hoping somebody can suggest a way to fix this, or a better way of going about it.
Or maybe Oculus/Epic would like to save me the trouble of fumbling through their code and do it for me? :wink:

**** Sir! Thank you so much for sharing this. I hope this post gets all the attention it deserves. A friend and I were working on a VR experience months ago and were not able to achieve what you have. Kudos to you.

Man, I really hope they’ll help you out. This looks cool as hell. :slight_smile:

Thanks guys,

I managed to sort this out a while ago. I now have nice low-latency head tracking for the second rig!
I just haven't had a chance to refine it.

It’s still in the category of ugly hacks. :slight_smile:

My thoughts on the subject were to KISS: encapsulate the player in a sphere with inverted normals (basically a hollowed-out sphere), then move both the player and the sphere to the new location and dissipate the sphere, and there you go. It's quick and dirty; however, it should work.

The only problem with this method is that, in a multiplayer environment, other players would see the transition sphere. So perhaps there would be a way to stop the sphere/player replicating, and after the sphere is dissolved, make the player replicate again?

I was thinking of doing something similar for low-end systems, as the additional stereo render obviously comes at a cost.

That’s classy. Maybe you could figure out a way to have the sphere not be visible to anyone but that player?

You just don't make it replicate, so basically it exists only on the client side rather than on the server.

Also, with the normals being inverted, the sphere wouldn't be visible to people outside of it anyway :wink: However, it could cause issues. I'd rather not tempt the server gnomes.
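For reference, a hedged sketch of the client-only sphere idea against the UE4 API (not standalone-compilable; ATransitionSphere and SphereMesh are hypothetical names):

```cpp
// A transition sphere that never replicates and is only ever visible
// to the owning player.
ATransitionSphere::ATransitionSphere()
{
	bReplicates = false; // never sent to the server or other clients

	SphereMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("SphereMesh"));
	RootComponent = SphereMesh;

	// Belt and braces: even if it existed elsewhere, only the owner sees it.
	SphereMesh->SetOnlyOwnerSee(true);
	SphereMesh->SetCastShadow(false);
}
```

Spawning it client-side (e.g. from the owning PlayerController) plus SetOnlyOwnerSee should cover both the replication and the visibility concerns raised above.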

Just wanted to say that this is a very awesome hack. Have you worked on this any more? Have you managed to figure out a way around the head-movement delay for the setup?

VR still needs some awesome portals :slight_smile:

Yep, I figured that out a while back. It does involve adding five lines of code to the Oculus plugin, unfortunately. So between that and the fact that you can't subclass SceneCaptureComponent2D, it's still in the realms of ugly hack. If you've got a GitHub branch I can push a pull request your way if you want to play with it.

What are those 5 lines of code, care to share?

I’ll have to dig out the code and put it all up on my github when I get a chance.

The only problem is that I have an open pull request with Epic at the moment. I know that if I push anything to my 4.10 branch it will get added to that open pull request. Does anybody know if I can push to a different branch on my GitHub without affecting the PR that I have with Epic?

I'm a bit of a git noob. ]-)

There we go. I managed to put the commit up on the 4.9 branch of my fork.

Cheers for that share. Will need to find some time to try these tricks out!

Not a problem. Any questions just give us a shout.

Thanks for this, guys. Really awesome stuff. Hopefully this will make its way into an official future release of the engine.

I doubt it will.
It's hackier than Mr Hacky from Hacksville, who couldn't come in to work today at the hacksaw factory because someone hacked his legs off. :slight_smile:

Plus the use cases are limited to having half your frame time spare. Rendering four views is obviously expensive.

What may have some value in there is the render-thread delegate I added to the HMD interface that broadcasts the final eye poses.
If we had that, then things like Getnamo's Leap plugin could use it to remove any lag caused by only having access to the HMD location on the game thread.

I'm not familiar with the inner workings, but could you perhaps use a stencil mask (kind of like the one used in 4.11 to avoid rendering the edge pixels; not sure if it is actual stencils or something else) to render only the area visible from both pairs of views, so at least the pixel-shader cost would be about the same as rendering just the regular two views? Also: this is pretty neat :slight_smile:

That's definitely a good idea. I might revisit this when I'm done with the plugin.

This is really impressive! Stuff like this gives me hope that eventually UE4 will have something like the old UTPortals feature… I’ve got so many ideas, but they all require it, and I’ve been unable to find something that works… Think your changes might be useful for that?

I've yet to do any code work in UE4 myself, so I'm not exactly sure how to go about trying/testing your teleport concept, though I do have a DK2.
Any chance someone could eventually put something like this into a template format? It could be great as a template for people to build VR stuff off of! :smiley:

EDIT 3/8/16:
Just a little note: some of the info in the OP partly helped me figure out how to create this effect:

Which was for my little VR demo:
Though I did it entirely with Blueprints/materials, because I like avoiding coding… heh