Material Experiments - What's Going on Here

I’ve been trying to wrap my head around UE4’s material system the past few days. One thing I’ve been trying to figure out is how to do the equivalent of CG’s GrabPass to do post-process only on the part of the scene behind a particular mesh.

With a little help from the forums, I learned that SceneColor in a translucent surface material is the equivalent of CG’s GrabPass, and it worked great until I tried to hook up texture coordinates to it:

(all screenshots can be clicked for larger versions)

http://9.t.imgbox.com/8e9X9OGK.jpg

I posted this problem to the forums a few days ago and got the answer that I needed to use a custom UV slot to pass the texture coordinates to the fragment shader in order to do this. Only, doing that:

http://7.t.imgbox.com/btQrPrNG.jpg

resulted in no change to what I saw:

http://5.t.imgbox.com/ilgsWi9B.jpg

After a little experimentation, I found that using screen position instead of UVs as input:

http://0.t.imgbox.com/wGOoFwGp.jpg

seemed to give the result I wanted:

http://1.t.imgbox.com/7syjYepx.jpg

That is, until I started manipulating the screen coordinates to get different effects. If, for example, I scale them:

http://3.t.imgbox.com/XaWLyWH9.jpg

I get exactly what I expect, EXCEPT that my character, who is in front of the geometry, is included.

http://6.t.imgbox.com/rS7xs2Ch.jpg
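For reference, the scaling I'm doing boils down to this math (sketched here in C for illustration; the function and parameter names are mine, not UE4's — in the material it's just a few Multiply/Add nodes on ScreenPosition):

```c
#include <math.h>

// Scale a screen-space UV coordinate about the screen center (0.5),
// which magnifies or shrinks the grabbed scene around the middle of
// the screen. Applied per-component to the ScreenPosition UVs.
float scale_screen_uv(float u, float scale)
{
    return (u - 0.5f) * scale + 0.5f;
}
```

A scale of 1.0 leaves the coordinate untouched; values below 1.0 zoom in on the center of the grabbed scene.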

Similarly, if I try to do a Sims/Saints Row-style censor mosaic effect:

http://9.t.imgbox.com/8i6xnY59.jpg

The character in front of the mesh gets included in that calculation as well:

http://3.t.imgbox.com/SOcKVzfh.jpg

Now, the more I think about it, the more sense this makes. SceneColor is giving me the full rendered scene, and that includes everything regardless of where it sits relative to my mesh.

I’m starting to get a feel for how UE4’s material system relates to the underlying shaders that get built, but there are still large parts of it that I’m not fully grokking. SceneColor seems to give me almost exactly what I want, only I have no idea how I might exclude items between the mesh and the camera, which seems like a prerequisite to doing a lot of useful things with SceneColor in a surface material.

Is there a way to use the depth buffer here? Or is there a way to do a shader pass that only includes objects not between the mesh and the camera?

Any thoughts or insight would be much appreciated.

Thanks!

Hi -

The quickest way to solve your problem is to use a render-to-texture workflow. Add a Scene Capture Actor and the Plane to a Blueprint, with the Plane as the root object in the Blueprint Components. Position the Scene Capture Actor at the edge of the plane, facing its reverse side, so the camera captures a running video of what the reverse side of the plane would see. Save your Blueprint and create a render target texture, which you would use in place of the SceneColor node. Adjust the UVs of that texture as desired in the material and assign that material to the Plane in your Blueprint. This will give you the effect you are looking for.

The Scene Color node cannot cull items between the camera and the mesh. You can use SceneTexture:CustomDepth and assign the mesh a custom render depth, but this will cull all other meshes except those in the custom render depth.
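The comparison a material would make with custom depth boils down to this (sketched in C for illustration only — in a material it would be nodes or a Custom HLSL expression, and the function name is mine):

```c
// A pixel of the grabbed scene is "safe" to post-process only if
// nothing sits between the camera and the plane at that pixel, i.e.
// the scene's depth is at or beyond the plane's depth. Depths are in
// world units, as with UE4's SceneDepth.
float occluder_mask(float scene_depth, float plane_depth)
{
    // 1.0 -> nearest scene pixel is on or behind the plane
    // 0.0 -> something occludes the plane at this pixel
    return scene_depth >= plane_depth ? 1.0f : 0.0f;
}
```

The mask could then lerp between the processed and unprocessed scene color, though as noted above the stock Scene Color node doesn't expose this culling by itself.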

Hopefully this will help you, if you need more or could use some visuals let me know and I will get back to you -

Eric Ketchum

Thank you for this information - it helps a lot. I’ve used scene capture components, but was mostly trying to see if I could accomplish this purely in a material. Sounds like I can’t, so that’s good enough for me. :)