Formulation of custom projection matrices in Capture Components

I’m trying to build a custom projection matrix for a USceneCaptureComponent2D so I can support independent Fx and Fy focal lengths. Based on my reading here and here, I understand the general form is an upper-triangular matrix that incorporates the focal lengths, principal point, skew, and near/far frustum planes, but quite a bit of this seems to depend on the conventions of the underlying graphics system.

So far I’ve got:

CaptureComponent->bUseCustomProjectionMatrix = true;
CaptureComponent->CustomProjectionMatrix = ProjectionMatrix(...);

FMatrix ProjectionMatrix(const FVector2d& FocalLengthPixels,
                         const FVector2d& PrincipalPointPixels,
                         const FVector2d& ImageSizePixels, const float Skew)
{
	constexpr float nearClipPlane = 10.0f;
	constexpr float farClipPlane = 100000.0f;

	const FVector2d Scale = nearClipPlane * 2.0f * FocalLengthPixels / ImageSizePixels;
	const FVector2d Offset = FVector2d(1.0f, -1.0f) * (2.0f * PrincipalPointPixels / ImageSizePixels - 1.0);
	const float Shear = 2.0f * Skew / ImageSizePixels.X;

	// Reversed-Z (UE default) convention: near maps to 1, far maps to 0
	return FMatrix(
		FPlane(Scale.X, Shear, Offset.X, 0.0f),
		FPlane(0.0f, Scale.Y, Offset.Y, 0.0f),
		FPlane(0.0f, 0.0f, 0.0f, nearClipPlane),
		FPlane(0.0f, 0.0f, 1.0f, 0.0f));
}

which replaces a capture component configured with a single focal length:

const float HorizontalFOV = FMath::RadiansToDegrees(2.0f * FMath::Atan(0.5f * float(ImageSizeX) / focalLengthX));
CaptureComponent->FOVAngle = HorizontalFOV;

This seems to work (it gives the same image as a CaptureComponent configured with FOVAngle and a render target of a known size), but there are a number of things I don’t understand. I’m hoping you can elaborate on the Unreal Engine conventions and check my math. A couple of questions:

  1. I haven’t found any resources indicating that `Scale` should contain `nearClipPlane`, but without it my image is far too zoomed out. This almost certainly means another part of the matrix is wrong (the `nearClipPlane` on the third row?). Can you tell me what this matrix should be, given these parameters?
  2. I use a post-process material to apply distortion effects to the final image. This works fine with a “standard” FOVAngle-configured Capture Component, but the lower triangle of the final image is black when I configure the Capture Component with this matrix. If the images from each capture component are the same before the post-process distortion, I don’t understand why the projection-matrix image has this defect. How could a custom projection matrix affect post-processing?


Hi,

I’ve started looking into this. While I don’t yet know exactly why your matrix isn’t yielding the expected results, I did notice some important differences between your code and TPerspectiveMatrix<T>::TPerspectiveMatrix(T HalfFOVX, T HalfFOVY, T MultFOVX, T MultFOVY, T MinZ, T MaxZ) in Source\Runtime\Core\Public\Math\PerspectiveMatrix.h, which constructs a projection matrix from FOVX and FOVY:

template<typename T>
FORCEINLINE TPerspectiveMatrix<T>::TPerspectiveMatrix(T HalfFOVX, T HalfFOVY, T MultFOVX, T MultFOVY, T MinZ, T MaxZ)
	: TMatrix<T>(
		TPlane<T>(MultFOVX / FMath::Tan(HalfFOVX), 0.0f, 0.0f, 0.0f),
		TPlane<T>(0.0f, MultFOVY / FMath::Tan(HalfFOVY), 0.0f, 0.0f),
		TPlane<T>(0.0f, 0.0f, ((MinZ == MaxZ) ? (1.0f - Z_PRECISION) : MaxZ / (MaxZ - MinZ)), 1.0f),
		TPlane<T>(0.0f, 0.0f, -MinZ * ((MinZ == MaxZ) ? (1.0f - Z_PRECISION) : MaxZ / (MaxZ - MinZ)), 0.0f)
	)
{ }

There is also a TReversedZPerspectiveMatrix<T>::TReversedZPerspectiveMatrix(T HalfFOVX, T HalfFOVY, T MultFOVX, T MultFOVY, T MinZ, T MaxZ), which handles Unreal’s reversed-Z depth mapping for you.

As for your question about why part of the image renders black: some post-processing effects rely on the scene’s depth. It’s possible that the frustum is set up incorrectly and that near/far-plane culling produces z/w values outside the valid [0, 1] range, which the GPU clips before further processing. Also, when the W component is not proportional to the distance from the camera, perspective-correct interpolation across a triangle face breaks, which can make part of the screen disappear.

Would it be possible to provide a simple example project for testing?

Thanks,

Sam
