Hi, I want to get a depth map from a specific view, and I want it as a texture so I can access the data in C++.
So I did the following setup: I placed a SceneCapture2D at the view I want. According to the rendering settings, it should render the scene into RGB and the depth into the alpha channel of a render target:
Ok, after a little research I think I might know what the problem is. The depth stream (coming from SceneDepth, PixelDepth, or elsewhere) is always the distance in cm. So every pixel in the depth scene has a value > 1, because it is at least a few cm away - but when you cast this to a color, you just see white.
I have two options now: find an option in SceneCapture2D where you can set a depth divider (divide by the max depth to get a value from 0 to 1 - see here: Scene Depth capture does not work - Rendering - Epic Developer Community Forums),
or use some code or a blueprint to access the pixel values and divide them manually (or at least verify that my assumption is right).
Ok, yes, the problem was alpha values > 1 because the depth was > 1 cm.
I fixed it by creating a second material that uses only the alpha channel (where the depth value is written), divides it by the maximum distance I want, and uses a Lerp to get an image where white is close and black is far:
Great solution, man! Thanks for sharing! Allow me to make just a little adjustment: use a "Remap Value Range" node instead of "Divide". It lets us set the min and max targets.