Accessing the depth buffer from a post-opaque compute shader

Hello there!

I’ve been playing around a bit with the post-opaque render callback, but for some reason I can’t seem to access depth information from it.
Looking at the main render code, it definitely seems like it should be accessible: the hook is called after the base pass, but before the post-process system, where we all know the buffer is available.
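
For reference, this is the hook I mean; it’s registered through UE4’s IRendererModule. A minimal sketch, where the module and function names are my own placeholders:

#include "RendererInterface.h"

void FMyModule::StartupModule()
{
    IRendererModule& RendererModule = GetRendererModule();
    RendererModule.RegisterPostOpaqueRenderDelegate(
        FPostOpaqueRenderDelegate::CreateRaw(this, &FMyModule::RenderPostOpaque));
}

// Called by the renderer after the base pass; the parameters carry the
// command list and the depth texture I'm trying to read.
void FMyModule::RenderPostOpaque(FPostOpaqueRenderParameters& Parameters)
{
    // Parameters.RHICmdList and Parameters.DepthTexture are available here.
}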

Creating an SRV from FPostOpaqueRenderParameters.DepthTexture (or from FSceneRenderTargets::Get(RHICmdList).DepthTexture, for that matter) and binding it should let me read the depth from my compute shader by declaring a Texture2D<float> and accessing the r component. I’ve tried to cross-verify this by looking at the binding that common.usf uses, and I’ve also verified that the PF_DepthStencil buffer uses the DXGI_FORMAT_R24G8_TYPELESS DX11 format, which should allow this.
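
Concretely, the creation and binding I’m attempting looks something like this. A sketch, assuming UE4’s RHI helpers: the ComputeShader variable and the FShaderResourceParameter members are from my own code, while RHICreateShaderResourceView, SetSRVParameter and PF_X24_G8 are the engine’s (PF_X24_G8 being the format FSceneRenderTargets itself uses for its stencil SRV):

// DepthTexture comes from FPostOpaqueRenderParameters.
FTexture2DRHIParamRef DepthTexture = Parameters.DepthTexture;

// Plain SRV over the depth plane; read in HLSL as Texture2D<float>.
FShaderResourceViewRHIRef SceneDepthSRV =
    RHICreateShaderResourceView(DepthTexture, /*MipLevel=*/0);

// The stencil plane needs an explicit format so the G8 channel is exposed.
FShaderResourceViewRHIRef StencilSRV =
    RHICreateShaderResourceView(DepthTexture, /*MipLevel=*/0, /*NumMipLevels=*/1, PF_X24_G8);

// Bind both SRVs to the compute shader (the parameter members are my own).
const FComputeShaderRHIParamRef ShaderRHI = ComputeShader->GetComputeShader();
SetSRVParameter(RHICmdList, ShaderRHI, ComputeShader->SceneDepthTextureParam, SceneDepthSRV);
SetSRVParameter(RHICmdList, ShaderRHI, ComputeShader->CustomStencilTextureParam, StencilSRV);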

I’ve been fighting this for about two days now, and since I just can’t seem to crack it, I thought I’d ask around to see if anyone has already done this or can otherwise provide some insight :)

Here is an excerpt from the compute shader:

// The default Texture2D type is <float4>, which won't read the X24_TYPELESS_G8_UINT format properly.
Texture2D<uint4> CustomStencilTexture2D;
Texture2D<float> SceneDepthTexture;
Texture2D CustomDepthTexture;

RWStructuredBuffer<float> OutputIDBuffer;

#define FLT_EPSILON 0.0001f

[numthreads(IDBUFFER_THREADGROUP_SIZEX, IDBUFFER_THREADGROUP_SIZEY, 1)]
void MainComputeShader(uint3 ThreadId : SV_DispatchThreadID)
{
    float SceneDepth = SceneDepthTexture.Load(int3(ThreadId.xy, 0)).r;       // This should work, since the depth texture seems to be DXGI_FORMAT_R24G8_TYPELESS?
    float CustomDepth = CustomDepthTexture.Load(int3(ThreadId.xy, 0)).r;     // Ends up as zero as well...
    int CustomStencil = CustomStencilTexture2D.Load(int3(ThreadId.xy, 0)).w; // This uses the engine's special stencil SRV. I haven't looked much into it, but I assume it's created with the DXGI_FORMAT_X24_TYPELESS_G8_UINT format.

    if (ThreadId.x == 1 && ThreadId.y == 1)
    {
        OutputIDBuffer[0] = SceneDepth;  // Always zero... sad face
        OutputIDBuffer[1] = CustomDepth; // Always zero...
        OutputIDBuffer[3] = CustomStencilTexture2D.Load(int3(ThreadId.xy, 0)).w; // This works fine!
    }
}


Since the stencil works fine, I can only assume the depth buffers themselves are valid, and that I’m doing something wrong either when creating my SRVs, when binding them, or in my HLSL declarations for them.
Sigh…

Best regards,
Temaran


I spent another hour on this and managed to get custom depth working. Unfortunately, scene depth is still not giving in. I was thinking maybe I need to prepare the depth buffer somehow before I can read it. Hmm…
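
In case it helps anyone, the custom depth read ended up looking roughly like this. A sketch only: FSceneRenderTargets lives in the Renderer module’s private includes, exact member access may differ between engine versions, and the parameter member is my own:

FSceneRenderTargets& SceneContext = FSceneRenderTargets::Get(RHICmdList);
if (SceneContext.CustomDepth.IsValid())
{
    // Create an SRV over the custom depth target and read it as Texture2D in HLSL.
    FShaderResourceViewRHIRef CustomDepthSRV = RHICreateShaderResourceView(
        (FTexture2DRHIRef&)SceneContext.CustomDepth->GetRenderTargetItem().ShaderResourceTexture, 0);
    SetSRVParameter(RHICmdList, ShaderRHI, ComputeShader->CustomDepthTextureParam, CustomDepthSRV);
}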

Oh my, I completely forgot about this thread. I solved the problem shortly after my last post.
It turned out that scene depth was still bound as the render target’s depth buffer from a few calls before mine, which caused my SRV bind to fail silently. I was a bit surprised to find that this didn’t trigger any API warnings, even though I have all D3D11 debug logging turned on.
Quite frustrating, to be honest.
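
For anyone hitting the same wall: the fix boils down to making sure the depth buffer is no longer bound as the active depth-stencil target before creating and binding the SRV. A sketch, assuming UE4’s RHI utilities; the exact calls vary by engine version:

// Unbind any render/depth targets, then transition the depth texture to a
// readable state before handing the SRV to the compute shader.
SetRenderTarget(RHICmdList, FTextureRHIRef(), FTextureRHIRef());
RHICmdList.TransitionResource(EResourceTransitionAccess::EReadable, DepthTexture);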

But as always, RenderDoc to the rescue :)

Best regards,
Temaran

Sir, you’re always chatting with yourself :p, and you always get your answers. Thank you for sharing everything; it’s a big help.

Hi, do you know how to create a depth buffer with the PF_DepthStencil format?