How to capture scene depth into an image and then sample it?

I need to capture scene depth for my level (top-down view). I am using a Scene Capture 2D component placed in the level, and this is what I am getting:

When I worked with depth in other engines and 3D apps, it was always a grayscale image with smooth transitions (kinda like a heightmap). I am not quite sure why it's coming out like a stencil mask in UE4. If I capture scene color, the image comes out the way it's supposed to.
So that’s the first issue I need to resolve.

The second question is: once I have a proper depth image, how do I sample it in BPs or Niagara's Scratch Pad? I need to get the world Z value for the depth pixels.

I am using UE 4.27.2, forward rendering.


This is expected. Unreal's base unit is centimeters, which means 0-1 (the maximum range that can be displayed on your monitor) covers only 1 centimeter. It's not meant to be human-readable. Almost the entirety of the scene is outside the visible range, but assuming you are using a 16+ bit render target, the data is still there.
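As a back-of-the-envelope sketch in Python (the sample depth values below are made up, not from the thread), this is why every pixel displays as white even though the data is intact:

```python
# Hypothetical depth values in centimeters, e.g. treetop, rooftop, ground,
# as seen from a top-down capture. Unreal stores depth in cm, so anything
# more than 1 cm from the camera already exceeds the displayable 0-1 range.
depth_samples_cm = [150.0, 2500.0, 48000.0]

# What the monitor shows: every value clamps to 1.0 -> solid white,
# even though the render target still holds the real numbers.
displayed = [min(d, 1.0) for d in depth_samples_cm]
print(displayed)  # [1.0, 1.0, 1.0]
```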

Textures can be sampled from Blueprint using the ReadRenderTargetPixel node (I think that's what it's called). It is insanely slow and not suitable for realtime.

Niagara has a texture sampling module, you can open it up and look at it. Sampling textures can only be done in GPU emitters.


I am using a 16-bit red-channel-only render target. When I export it to HDR and open it in GIMP (which works with 32-bit images), the red channel is just plain white.
My level is 5 km² in size. What settings in the Scene Capture 2D do I need to capture depth properly? (I only need to capture it once, in the Editor or PIE, and then sample that render target at runtime.)

Yes, and the output is correct. Again, it's not meant to be human-readable. You can't export it and expect it to look right; all the values are still too high to be displayed.

If you want to bring it into a visible range, you will have to divide it by a large number. This is useful for debugging but completely unnecessary for sampling:
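A minimal Python sketch of that divide, assuming a hypothetical 5 km maximum depth (in the material itself this would just be a Divide node before the output):

```python
# Assumed maximum depth for debug visualization: 5 km in centimeters.
# This constant is an illustration, not a value from the thread.
MAX_DEPTH_CM = 500_000.0

def to_debug_gray(depth_cm: float) -> float:
    """Remap a raw depth value (cm) into the displayable 0-1 range."""
    return min(depth_cm / MAX_DEPTH_CM, 1.0)

print(to_debug_gray(250_000.0))  # surface halfway down -> 0.5 (mid gray)
print(to_debug_gray(48_000.0))   # near surface -> dark gray
```

The choice of denominator only affects how the debug image looks; when sampling for gameplay you read the raw centimeter values directly.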


Maybe I can use this data to construct blocking volumes (not sure if I can "spawn" a volume in a Construction Script).

GPU-only, ouch :( Do you happen to know how it works? Basically, I need to get the value of the pixel that is closest to a given particle in world coordinates.

It returns the sampled color given a UV coordinate. Assuming this is a top-down projection of your entire level, you should be able to just divide the particle XY coordinates by the length/width of the capture area (in world units) and then add 0.5.
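A Python sketch of that mapping, assuming a hypothetical orthographic capture that is axis-aligned and centered over the world origin (`CAPTURE_CENTER`, `CAPTURE_SIZE_CM`, and `CAPTURE_Z_CM` are assumptions for illustration, not values from the thread):

```python
# Assumed capture setup: top-down orthographic Scene Capture 2D,
# centered at world origin, covering a 5 km square, camera 1 km up.
CAPTURE_CENTER = (0.0, 0.0)
CAPTURE_SIZE_CM = 500_000.0
CAPTURE_Z_CM = 100_000.0

def world_to_uv(x: float, y: float) -> tuple[float, float]:
    """Map a particle's world XY (cm) to a UV in the depth capture."""
    u = (x - CAPTURE_CENTER[0]) / CAPTURE_SIZE_CM + 0.5
    v = (y - CAPTURE_CENTER[1]) / CAPTURE_SIZE_CM + 0.5
    return (u, v)

def depth_to_world_z(depth_cm: float) -> float:
    """Recover world Z from a sampled depth: camera height minus depth."""
    return CAPTURE_Z_CM - depth_cm

print(world_to_uv(0.0, 0.0))               # level center -> (0.5, 0.5)
print(world_to_uv(250_000.0, -250_000.0))  # a corner -> (1.0, 0.0)
print(depth_to_world_z(30_000.0))          # ground 300 m below camera
```

Depending on how the capture component is rotated, one UV axis may come out flipped; worth verifying against a known landmark in the level.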

It's been a while since I've worked with it, but there are examples of this in the Niagara maps in the Content Examples project.


Unfortunately, that doesn't work for me: my material that should display the render target shows a solid gray color (and when I change the denominator, it just goes from white to gray to black).

Finally got it working (those faint dots are trees :) )!

Had to use 32f instead of 16f.
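A plausible reason why (my inference, not stated in the thread): the largest finite 16-bit float is 65504, and a 5 km level stores depths up to 500,000 cm, so distant pixels overflow to infinity in a half-float channel and read back as plain white. A quick NumPy check:

```python
import numpy as np

# Half-float (16F) cannot represent depths beyond ~655 m in centimeters;
# larger values overflow to inf, which displays as white.
print(np.finfo(np.float16).max)  # 65504.0
print(np.float16(500_000.0))     # inf  -> clipped/white in a 16F target
print(np.float32(500_000.0))     # 500000.0 -> fits fine in a 32F target
```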
