I’m trying to figure out how to do something complicated. I have a roughly intermediate understanding of UE4 shaders, but this seems over my head.
I’d like to use some sort of camera (presumably a Scene Capture 2D actor) and ultimately transform that camera’s output into a realtime normal map. That is, I’d like to place something like a Scene Capture 2D in my scene, use that scene capture’s texture target to create a live normal map resembling the captured scene, and then feed that normal map into the normals of some material. The result would hopefully look like a flat surface embossed with the shapes and objects of my initial scene, but the embossed shapes would move and change in real time, depending on whatever’s happening in that scene.
Ooo! I tried plugging things in, and this does get me closer, but the effect is so subtle that it’s barely discernible; it doesn’t really read as embossed because it isn’t nearly strong enough. This, for instance, is a render target of the premade third-person player level plugged into a wood material’s normals:
Oh! Yes, I do know how to do that, but I don’t want a normal map like that to be the final product. I’d like to use a normal map like that to change the normals of a different material, something like a pane of glass (or the wood I showed above). The problem is that if I just plug the render texture into the final material’s normal input, the effect is too subtle (as seen in the wood example) to make out any of the movement coming from the render texture.
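What I’m picturing is something like a Custom node that treats the capture as a height map and builds a stronger normal out of it, so the relief can be turned up. This is only my rough sketch, and the input names (RT, UV, TexelSize, Strength) are just what I’d call the node’s pins, so I’m not sure it’s wired correctly:

```hlsl
// Custom node, output type Float3. All input names below are made up by me:
//   RT        - Texture Object input holding the Scene Capture's render target
//   UV        - float2 texture coordinates
//   TexelSize - float, 1 / render target resolution (e.g. 1.0 / 1024.0)
//   Strength  - float, how deep the emboss should look
// Brightness of the capture is treated as height; neighbouring samples give the slope.
float3 lum = float3(0.299, 0.587, 0.114);
float hL = dot(Texture2DSample(RT, RTSampler, UV - float2(TexelSize, 0)).rgb, lum);
float hR = dot(Texture2DSample(RT, RTSampler, UV + float2(TexelSize, 0)).rgb, lum);
float hD = dot(Texture2DSample(RT, RTSampler, UV - float2(0, TexelSize)).rgb, lum);
float hU = dot(Texture2DSample(RT, RTSampler, UV + float2(0, TexelSize)).rgb, lum);

// Slopes become the normal's X and Y; Strength exaggerates the relief.
float3 n = float3((hL - hR) * Strength, (hD - hU) * Strength, 1.0);
return normalize(n);   // tangent-space normal for the material's Normal input
```

Then the glass or wood material would use this output instead of the raw capture, and Strength could be cranked up until the emboss is actually visible.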
You can use a custom post process material for this (the SceneTexture node). It’s explained in this doc under the “Material Expression ‘SceneTexture’” section.
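If you go the post process route, a Custom node can also read the scene directly once a SceneTexture node exists in the graph. Rough sketch only (the input names are my own, and the PostProcessInput0 index may differ between engine versions):

```hlsl
// Custom node inside a Post Process material (sketch, not a tested recipe).
// A SceneTexture:PostProcessInput0 node must be present somewhere in the material
// so the lookup is bound; 14 is PostProcessInput0's index in UE4.
// UV is assumed to come from a ScreenPosition (ViewportUV) input.
float3 sceneColor = SceneTextureLookup(UV, 14, false).rgb;
// From here the same brightness-to-normal trick as in the sketch above applies,
// using View.ViewSizeAndInvSize.zw as the one-pixel offset between samples.
return sceneColor;
```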