Hello there!
I’ve been playing around a bit with the post opaque render callback, but for some reason I can’t seem to access depth information from it.
Looking at the main render code, the depth buffer definitely seems like it should be accessible: this hook is called after the base pass, but before the post process system, where we all know the buffer is available.
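For reference, this is roughly how I hook in (a minimal sketch of my setup; FMyModule and MyPostOpaquePass are my own names, while RegisterPostOpaqueRenderDelegate comes from IRendererModule):

#include "RendererInterface.h"
#include "Modules/ModuleManager.h"

void FMyModule::StartupModule()
{
    // Game thread: grab the renderer module and register the callback.
    IRendererModule& RendererModule = FModuleManager::LoadModuleChecked<IRendererModule>(TEXT("Renderer"));
    RendererModule.RegisterPostOpaqueRenderDelegate(
        FPostOpaqueRenderDelegate::CreateRaw(this, &FMyModule::MyPostOpaquePass));
}

void FMyModule::MyPostOpaquePass(FPostOpaqueRenderParameters& Parameters)
{
    // Render thread: Parameters.DepthTexture is the scene depth target, and
    // Parameters.RHICmdList is (as I understand it) the command list for this frame.
}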
Creating an SRV from FPostOpaqueRenderParameters.DepthTexture (or FSceneRenderTargets::Get(RHICmdList).DepthTexture, for that matter) and binding it should let me read depth in my compute shader by declaring a Texture2D<float> and sampling the r component. I’ve tried to cross-verify this against the bindings the Common.usf file uses, and I’ve also confirmed that the PF_DepthStencil buffer uses the DXGI_FORMAT_R24G8_TYPELESS DX11 format, which should allow exactly this kind of read.
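Concretely, this is what my SRV creation looks like (again a sketch, inside the callback from above; I’m assuming the single-mip overload of RHICreateShaderResourceView is enough to view the typeless depth resource as DXGI_FORMAT_R24_UNORM_X8_TYPELESS):

// Render thread, inside the post opaque callback.
FRHICommandListImmediate& RHICmdList = *Parameters.RHICmdList;
FSceneRenderTargets& SceneContext = FSceneRenderTargets::Get(RHICmdList);
FTexture2DRHIRef SceneDepthTexture = SceneContext.GetSceneDepthTexture();

// A mip 0 view over the depth target; reading .r in the shader should then
// yield the 24-bit depth portion as a normalized float.
FShaderResourceViewRHIRef SceneDepthSRV = RHICreateShaderResourceView(SceneDepthTexture, 0);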
I’ve been fighting this for about two days now, and since I just can’t seem to crack it, I thought I’d ask around to see if anyone has already done this or can otherwise provide some insight.
Here is an excerpt from the compute shader:
//The default Texture2D type is <float4>, which won't read the DXGI_FORMAT_X24_TYPELESS_G8_UINT stencil format properly.
Texture2D<uint4> CustomStencilTexture2D;
Texture2D<float> SceneDepthTexture;
Texture2D CustomDepthTexture;
RWStructuredBuffer<float> OutputIDBuffer;

#define FLT_EPSILON 0.0001f

[numthreads(IDBUFFER_THREADGROUP_SIZEX, IDBUFFER_THREADGROUP_SIZEY, 1)]
void MainComputeShader(uint3 ThreadId : SV_DispatchThreadID)
{
    float SceneDepth = SceneDepthTexture.Load(int3(ThreadId.xy, 0)).r; //This should work, since the depth texture seems to be of format DXGI_FORMAT_R24G8_TYPELESS?
    float CustomDepth = CustomDepthTexture.Load(int3(ThreadId.xy, 0)).r; //Ends up as zero as well...
    int CustomStencil = CustomStencilTexture2D.Load(int3(ThreadId.xy, 0)).w; //This uses the engine's special SRV to access the stencil buffer. I haven't looked much into it, but I assume they create a view with the DXGI_FORMAT_X24_TYPELESS_G8_UINT format.

    if (ThreadId.x == 1 && ThreadId.y == 1)
    {
        OutputIDBuffer[0] = SceneDepth;  //Always zero... sad face
        OutputIDBuffer[1] = CustomDepth; //Always zero...
        OutputIDBuffer[3] = CustomStencilTexture2D.Load(int3(ThreadId.xy, 0)).w; //This works fine!
    }
}
Since the stencil read works fine, I can only assume the depth buffers themselves are valid and dandy, and that I’m doing something wrong either when creating my SRVs, when binding them, or in my HLSL declarations.
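In case it helps, this is roughly how I bind and dispatch (a sketch as well; FMyIDBufferCS, its SetParameters/UnbindBuffers helpers and the SRV/UAV variables are my own names, and I reuse the thread group defines from the shader on the C++ side):

// Render thread, after creating the SRVs as above.
TShaderMapRef<FMyIDBufferCS> ComputeShader(GetGlobalShaderMap(GMaxRHIFeatureLevel));
RHICmdList.SetComputeShader(ComputeShader->GetComputeShader());

// My own wrappers around SetSRVParameter / SetUAVParameter for the
// FShaderResourceParameter members declared in the shader class.
ComputeShader->SetParameters(RHICmdList, SceneDepthSRV, CustomDepthSRV, CustomStencilSRV, OutputIDBufferUAV);

DispatchComputeShader(RHICmdList, *ComputeShader,
    FMath::DivideAndRoundUp(Parameters.ViewportRect.Width(), IDBUFFER_THREADGROUP_SIZEX),
    FMath::DivideAndRoundUp(Parameters.ViewportRect.Height(), IDBUFFER_THREADGROUP_SIZEY), 1);

ComputeShader->UnbindBuffers(RHICmdList);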
Sigh…
Best regards,
Temaran