I feel like there’d be a way to do this using postprocess materials. Just use SceneDepth to get the depth value of the pixel; it’s not a 1:1 correlation with the Time value of the trace but it DOES report absolute distance of the pixel from the screen. You can use the depth value to color the pixel and discard/blacken any pixels too far away.
The tricky part (at least for me) would be discarding and merging pixels to reduce the resolution from 1:1 pixel mapping to something low-res. It'd probably be possible with the Screen Percentage setting in a post-process volume: set that to 25, apply the post-process material that maps depth to color, and you'd be very close.
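To make the idea concrete, here's a rough sketch of the two steps in plain Python (not Unreal material code — the constant `max_dist` and the grid size of 25 are just illustrative values):

```python
# Step 1: map a scene-depth value (distance from the camera, in world
# units) to a 0-1 grayscale value, blackening anything past max_dist.
def depth_to_color(depth: float, max_dist: float = 2500.0) -> float:
    if depth >= max_dist:
        return 0.0  # discard/blacken pixels that are too far away
    return 1.0 - depth / max_dist  # nearer pixels come out brighter

# Step 2: "pixelate" by snapping a UV coordinate to the centre of a
# coarse grid — roughly what rendering at a low screen percentage does.
def snap_uv(u: float, cells: int = 25) -> float:
    cell = int(u * cells)
    return (cell + 0.5) / cells
```

In a real material you'd feed SceneDepth through the first mapping into Emissive Color, and let the reduced screen percentage handle the second step for you.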
Thanks for the reply.
The Depth Expressions page (Depth Material Expressions in Unreal Engine | Unreal Engine 5.3 Documentation ) says SceneDepth can only be used with translucent materials. Would it work in my scenario?
I’ve tried PixelDepth – instead of SceneDepth – with this material setup (following the tutorial implementation here: Post Process Materials in Unreal Engine | Unreal Engine 5.3 Documentation ):
But applying the material to the Global PostProcess object on the map simply paints the entire camera view red. Changing the denominator of the division operation doesn’t seem to affect anything either.
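One possible cause worth checking (this is a guess about your setup, not a confirmed diagnosis): Unreal reports depth in centimetres, so if the material divides depth by a small constant and routes the result into a color channel, almost every pixel exceeds 1 and the channel clamps — the whole screen saturates to red, and small changes to the denominator change nothing visible. A quick sketch of the arithmetic:

```python
# Why the screen can come out solid red: depth is in centimetres, so
# depth / denominator easily exceeds 1 and the colour channel clamps.
def red_channel(depth_cm: float, denominator: float) -> float:
    return min(depth_cm / denominator, 1.0)  # channels clamp at 1.0

# An object 10 m (1000 cm) away, divided by 100: already clamped.
print(red_channel(1000.0, 100.0))   # 1.0 -> fully red
# Dividing by something on the scale of the scene gives a gradient.
print(red_channel(1000.0, 5000.0))  # 0.2
```

If that's what's happening, a denominator on the order of the farthest distance you care about (tens of thousands of centimetres) should start producing a visible gradient.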
Any ideas?