Rendering RGB and Depth Images Simultaneously

I’m working on a system to simulate numerous cameras using SceneCapture2D components. I would like to render both an RGB image and a depth image for each camera. Each render target’s pixel buffer will be read and the data streamed over a network (the images will most likely never be displayed on-screen). Is there any way to accomplish this in a single render pass? Essentially, I would like to fetch the z-buffer after generating the RGB image. Can this be done without modifying the UE4 source code?

I’d appreciate any suggestions. Thank you.

No, you would have to alter the source code. And how would you do this in a single render pass anyway?

I’m not sure what you would like to do with the depth buffer data, but you might be able to store a “faked” depth buffer in the alpha channel if you don’t end up using any transparency in the game.

You also probably don’t want to push the entire texture over a network. Whatever analysis you want to run on it should probably be done locally.

Good luck!

Thank you. I ended up altering the source code as you suggested.

I’m using UE to create datasets for a robot perception pipeline. So I wanted to generate color images and depth maps to be used either offline or on another system.

Hi, mkaspr,
Could you share some details about how you got the z-buffer by altering the source code?