I thought I would share my experiments in getting a secondary stereo view for things such as portals and transport effects.
I'm also posting this in the hope that Oculus/Epic can help me take this further, or look into adding it to the engine themselves, as this would be a really handy thing to have (I believe Unity already has its own stereo rig).
My own reason for needing this is to implement a cool transporter effect, where a view of the new location is masked in over the current view, the player is moved to the new location, and the standard Oculus camera is then masked back in.
Like so,
To get this far I needed to do a few things.
Firstly I needed to customize the CaptureComponent to give me the option of using the HMD’s left/right eye projection matrices.
So I added an enum to SceneCaptureComponent.h, along with an editable variable.
#pragma once

#include "Runtime/Engine/Public/ShowFlags.h"
#include "SceneCaptureComponent.generated.h"

USTRUCT()
struct FEngineShowFlagsSetting
{
	GENERATED_USTRUCT_BODY()

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = SceneCapture)
	FString ShowFlagName;

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = SceneCapture)
	bool Enabled;
};

//OPAMP
UENUM(BlueprintType)
enum class EMyEye : uint8
{
	EYE_Off   UMETA(DisplayName = "Off"),
	EYE_Left  UMETA(DisplayName = "Left Eye"),
	EYE_Right UMETA(DisplayName = "Right Eye")
};
//OPAMP END

// -> will be exported to EngineDecalClasses.h
UCLASS(hidecategories=(abstract, Collision, Object, Physics, SceneComponent, Mobility), MinimalAPI)
class USceneCaptureComponent : public USceneComponent
{
	GENERATED_UCLASS_BODY()

	/** OPAMP STEREO EYE */
	UPROPERTY(EditAnywhere, Category = SceneCapture, meta = (DisplayName = "StereoEye"))
	EMyEye StereoEye;
Then some changes to FScene::CreateSceneRenderer within SceneCaptureRendering.cpp.
This checks StereoEye on the SceneCapture component and provides the HMD's left/right eye perspective projection, or the default projection if the eye is set to EMyEye::EYE_Off.
FSceneRenderer* FScene::CreateSceneRenderer( USceneCaptureComponent* SceneCaptureComponent, UTextureRenderTarget* TextureTarget, const FMatrix& ViewMatrix, const FVector& ViewLocation, float FOV, float MaxViewDistance, bool bCaptureSceneColour, FPostProcessSettings* PostProcessSettings, float PostProcessBlendWeight )
{
	FIntPoint CaptureSize(TextureTarget->GetSurfaceWidth(), TextureTarget->GetSurfaceHeight());
	FTextureRenderTargetResource* Resource = TextureTarget->GameThread_GetRenderTargetResource();
	FSceneViewFamilyContext ViewFamily(FSceneViewFamily::ConstructionValues(
		Resource,
		this,
		SceneCaptureComponent->ShowFlags)
		.SetResolveScene(!bCaptureSceneColour));

	FSceneViewInitOptions ViewInitOptions;
	ViewInitOptions.SetViewRectangle(FIntRect(0, 0, CaptureSize.X, CaptureSize.Y));
	ViewInitOptions.ViewFamily = &ViewFamily;
	ViewInitOptions.ViewMatrix = ViewMatrix;
	ViewInitOptions.BackgroundColor = FLinearColor::Black;
	ViewInitOptions.OverrideFarClippingPlaneDistance = MaxViewDistance;
	ViewInitOptions.SceneViewStateInterface = SceneCaptureComponent->GetViewState();

	if (bCaptureSceneColour)
	{
		ViewFamily.EngineShowFlags.PostProcessing = 0;
		ViewInitOptions.OverlayColor = FLinearColor::Black;
	}

	// Build projection matrix
	{
		float XAxisMultiplier;
		float YAxisMultiplier;
		if (CaptureSize.X > CaptureSize.Y)
		{
			// if the viewport is wider than it is tall
			XAxisMultiplier = 1.0f;
			YAxisMultiplier = CaptureSize.X / (float)CaptureSize.Y;
		}
		else
		{
			// if the viewport is taller than it is wide
			XAxisMultiplier = CaptureSize.Y / (float)CaptureSize.X;
			YAxisMultiplier = 1.0f;
		}

		//OPAMP
		bool bUsedStereoProjection = false;
		if (SceneCaptureComponent->StereoEye != EMyEye::EYE_Off
			&& GEngine->HMDDevice.IsValid()
			&& GEngine->HMDDevice->IsStereoEnabled())
		{
			const EStereoscopicPass StereoPass =
				(SceneCaptureComponent->StereoEye == EMyEye::EYE_Left)
				? EStereoscopicPass::eSSP_LEFT_EYE
				: EStereoscopicPass::eSSP_RIGHT_EYE;
			ViewInitOptions.ProjectionMatrix = GEngine->HMDDevice->GetStereoProjectionMatrix(StereoPass, 90.0f /*FOV currently unused by all implementations*/);
			ViewInitOptions.StereoPass = StereoPass; //IS THIS NEEDED?
			bUsedStereoProjection = true;
		}
		// Fall back to the default projection whenever the HMD projection was
		// not used (my first version left ProjectionMatrix unset when an eye
		// was selected but stereo was disabled).
		if (!bUsedStereoProjection)
		//END OPAMP
		{
			ViewInitOptions.ProjectionMatrix = FReversedZPerspectiveMatrix(
				FOV,
				FOV,
				XAxisMultiplier,
				YAxisMultiplier,
				GNearClippingPlane,
				GNearClippingPlane
				);
		}
	}

	FSceneView* View = new FSceneView(ViewInitOptions);
	View->bIsSceneCapture = true;

	check(SceneCaptureComponent);
	for (auto It = SceneCaptureComponent->HiddenComponents.CreateConstIterator(); It; ++It)
	{
		// If the primitive component was destroyed, the weak pointer will return NULL.
		UPrimitiveComponent* PrimitiveComponent = It->Get();
		if (PrimitiveComponent)
		{
			View->HiddenPrimitives.Add(PrimitiveComponent->ComponentId);
		}
	}

	ViewFamily.Views.Add(View);
	View->StartFinalPostprocessSettings(ViewLocation);
	View->OverridePostProcessSettings(*PostProcessSettings, PostProcessBlendWeight);
	View->EndFinalPostprocessSettings(ViewInitOptions);

	return FSceneRenderer::CreateSceneRenderer(&ViewFamily, NULL);
}
Next I created a Blueprint with a couple of SceneCapture2D components,
set them to left eye and right eye, rendering to a pair of 960x1080 render targets.
(Ignore the nearclip/farclip/disabletonemapper variables; they relate to something else I was working on.)
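For anyone who prefers C++, the equivalent of that Blueprint setup might look something like this. This is only a sketch, not tested code: LeftCapture/RightCapture are assumed to be UPROPERTY members on the pawn, and StereoEye is the property added to the component header above.

```cpp
// In the actor's constructor: create the two capture components and tag
// each with the custom StereoEye property from the modified
// USceneCaptureComponent above.
LeftCapture  = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("LeftCapture"));
LeftCapture->StereoEye  = EMyEye::EYE_Left;
RightCapture = CreateDefaultSubobject<USceneCaptureComponent2D>(TEXT("RightCapture"));
RightCapture->StereoEye = EMyEye::EYE_Right;

// In BeginPlay: create the two 960x1080 render targets and hook them up.
UTextureRenderTarget2D* LeftEyeTarget = NewObject<UTextureRenderTarget2D>(this);
LeftEyeTarget->InitAutoFormat(960, 1080);
LeftCapture->TextureTarget = LeftEyeTarget;

UTextureRenderTarget2D* RightEyeTarget = NewObject<UTextureRenderTarget2D>(this);
RightEyeTarget->InitAutoFormat(960, 1080);
RightCapture->TextureTarget = RightEyeTarget;
```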
Within the Blueprint we need to set the correct IPD for the capture components,
and for testing purposes make them match the HMD position/orientation.
And a postprocess attached to the camera component of my pawn.
Now this works fine with OVR 0.4.4 (the images should match up).
But porting it to OVR 0.6.0.0 caused some issues: the images were no longer lining up.
After a while I realised this is due to the 12-pixel padding that the Oculus eye textures now require.
As a temporary measure, setting the padding to 0 pixels in OculusRiftHMD.cpp fixes the issue,
but a proper fix would need to add the padding, and/or change the horizontal FOV of the projection matrix to compensate.
class FSettings : public FHMDSettings
{
public:
	const int TexturePaddingPerEye = 0; //padding, in pixels, per eye (total padding will be doubled) //OPAMP: was originally 12
I won't go into any more detail about how the transport effect was done, as this is getting a bit long and I really want to talk about the limitations of this currently ugly hack,
and the possibility of Oculus/Epic helping me out with it, or providing a proper stereo rig to use for effects such as transportation and portals.
The main limitation of this hack is the lack of timewarp/latency/prediction. For a relatively short effect like my transporter it's not too much of an issue (although it's still an issue).
Check out the lag with rapid head movements (the video glitches from 0:10-0:15, which is unrelated).
So I'm hoping somebody can suggest a way to fix this, or a better way of going about it.
Or maybe Oculus/Epic would like to save me the trouble of fumbling through their code and do it for me?