I’m creating a custom fog effect that needs to work both above and under water.
This means that things should be rendered in this order:

1. render opaque above- and under-water geometry, then apply fog to it in a post process
2. render water in the translucent pass
3. apply fog to the water
But what actually happens is this:

1. render opaque above- and under-water geometry
2. render water in the translucent pass
3. apply fog to the opaque geometry in post processing
4. apply fog to the water
The issue is the water distortion: the distortion effect looks up the original scene color texture (not the fogged one) to produce the distorted result, which causes artifacts. It would be useful to have a post-process material that could be injected between the opaque and translucent rendering passes, so that the water distortion effect would sample from the post-processed render target instead of the original one.
Is there any chance anything like this would be implemented in the future?
EDIT: the issue is that the “Before Translucency” material flag is a bit misleading.
@Deathrey the “Before Translucency” post-process material flag is quite misleading: it actually runs AFTER translucency has been rendered.
The “Before Translucency” flag is in the context of the post-processing pipeline, NOT in the context of the entire rendering pipeline. So “Before Translucency” only means “apply this post-process material before the ALREADY RENDERED translucency gets DOF applied in the post-processing stage.”
I see absolutely no difference whether the post-process pass is applied before the thing was rendered or before the thing was composited in. Just mind that it only affects items rendered in the separate translucency pass.
The issue is that the built-in water distortion effect uses the “scene color” texture which it samples at offsets to create a refraction effect.
What I’d need is for the translucency distortion to use the “post-processed” scene texture, which already has fog applied, and NOT the raw “scene color” texture without fog, since it is the fogged scene that needs to be distorted.
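To illustrate the problem (a deliberately simplified, hypothetical 1D sketch in Python, not actual engine code): refraction boils down to sampling the scene buffer at an offset, so if it reads the pre-fog buffer, the refracted result loses the fog:

```python
def apply_fog(color, fog_amount=0.5, fog_color=1.0):
    # stand-in for the fog post process: simple linear fog blend
    return color * (1.0 - fog_amount) + fog_color * fog_amount

def refract(scene, x, offset):
    # refraction: sample the scene at an offset position
    return scene[(x + offset) % len(scene)]

scene_color = [0.1, 0.2, 0.3, 0.4]            # raw "scene color" buffer
fogged = [apply_fog(c) for c in scene_color]  # buffer after the fog pass

x, offset = 1, 2
# what the hardcoded refraction does: sample the un-fogged buffer
wrong = refract(scene_color, x, offset)   # 0.4 -- fog is lost in the refracted area
# what I'd need: sample the already-fogged buffer
right = refract(fogged, x, offset)        # 0.7 -- fog survives the distortion
```

The difference between `wrong` and `right` is exactly the artifact: the area seen through the water is missing the fog contribution.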
I can probably use a similar approach to his, though ideally the translucency distortion would use an already processed render target as the input texture that it distorts.
That guy is exactly the person you were talking to in the post ^^ he surely knows what he’s talking about. As for your case, isn’t it a case of blending both post-process materials?
No, the issue is not the texture I can access when creating the water material, but the texture that the hardcoded refraction shader uses.
The method from your post works, thanks for that post!
Though now I assume I’m paying the cost of refraction twice: once for the custom material node and once for Unreal’s hardcoded refraction?
Perhaps I’ll try using an opaque material with a different render order, though I’m not sure I can use scene color there…