I need to capture scene depth for my level, top-down view. I am using a Scene Capture 2D component placed in the level, and this is what I am getting:
When I worked with depth in other engines and 3D apps, it was always a grayscale image with smooth transitions (kind of like a heightmap). I am not quite sure why it's coming out like a stencil mask in UE4. If I capture scene color, the image comes out the way it's supposed to.
So that’s the first issue I need to resolve.
The second question: once I have a proper depth image, how do I sample it in BPs or Niagara's Scratch Pad? I need to get the world Z value from the depth pixels.
This is expected. Unreal's base unit is centimeters, which means 0-1 (the maximum range that can be displayed on your monitor) is only 1 centimeter. It's not meant to be human-readable. Almost the entire scene is outside the visible range, but assuming you are using a 16+ bit render target, the data is still there.
Textures can be sampled from Blueprint using a node called (I think) ReadRenderTargetPixel. It is insanely slow and not suitable for realtime.
Niagara has a texture sampling module, you can open it up and look at it. Sampling textures can only be done in GPU emitters.
I am using a 16-bit Red-channel-only render target. When I export it to HDR and open it in GIMP (which works with 32-bit images), the Red channel is just plain white.
My level is 5 km² in size. What settings on the Scene Capture 2D do I need to capture depth properly? (I only need to capture it once, in the Editor or PIE, and then sample that render target at runtime.)
It returns the sampled color at a given UV coordinate. Assuming this is a top-down projection of your entire level, you should be able to just divide the particle XY coordinate by the length/width of the capture area (in world units) and then add 0.5.
It's been a while since I've worked with it, but there are examples of this in the Niagara maps in the Content Examples.