I’m trying to make a chromatic aberration effect that only affects things beyond a certain distance from the camera. I’ve got the chromatic aberration working, and objects within the range aren’t aberrating, but when the background aberrates it grabs the pixels of the objects within the range and distorts them, which ruins the effect. What I need is to make the objects within my range invisible so the effect can sample what’s behind them and distort that, before I make the objects visible again. So, is it possible to set the opacity of things based on a 0-to-1 image in a post-process? I know that if it is, I’ll have the effect I’m after.
The pictures below show my blueprint, the depth image I’m working with, and the result of the post-process as it is now.
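To make the setup concrete, this is roughly what the material does, written out as plain HLSL rather than Blueprint nodes (the texture names, the offset, and the mask convention are placeholders, not my actual node names):

```hlsl
// Rough HLSL equivalent of the Blueprint above (placeholder names, not real bindings).
Texture2D    SceneColorTex : register(t0);
Texture2D    DepthMaskTex  : register(t1); // 0 = within range, 1 = beyond range
SamplerState LinearClamp   : register(s0);

float3 MaskedAberration(float2 UV, float2 Offset)
{
    float3 scene = SceneColorTex.Sample(LinearClamp, UV).rgb;

    // Simple chromatic aberration: shift red and blue in opposite directions.
    float3 aberrated = float3(
        SceneColorTex.Sample(LinearClamp, UV + Offset).r,
        scene.g,
        SceneColorTex.Sample(LinearClamp, UV - Offset).b);

    // The 0-1 depth mask keeps near objects un-aberrated, but the offset
    // background samples can still land on those near objects' pixels,
    // which is the smearing I'm trying to get rid of.
    float mask = DepthMaskTex.Sample(LinearClamp, UV).r;
    return lerp(scene, aberrated, mask);
}
```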
Right. The thing is, I don’t want things within the range to affect the aberration of things outside it. The reason I want to make the foreground invisible is so the background can be distorted as if there’s nothing in front of it, and then I can layer the foreground elements on top without having affected the background.
You have a render pass (an image), so you can only read and modify data within it; it’s like a Photoshop filter. You could cut out the foreground, fill that area with some colour, and then apply the chromatic aberration, but the result would still look awkward. It’s not that the foreground is affecting the background; it’s that the background has to work with the same pixels, and there is no information about what’s behind the character.
You could read how this was done for DOF and write your own C++/HLSL implementation:
In addition to sampling the scene texture at the offset location, sample the scene depth there and compare it with the current pixel’s depth. Reject the sample if the depth difference is too great.
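Something along these lines (a plain HLSL sketch rather than a drop-in UE Custom node; the texture names, offset, and threshold are placeholders you’d bind yourself):

```hlsl
// Sketch only: generic HLSL, not tied to UE's material system.
Texture2D    SceneColorTex : register(t0);
Texture2D    SceneDepthTex : register(t1);
SamplerState LinearClamp   : register(s0);

float3 DepthAwareAberration(float2 UV, float2 Offset, float DepthThreshold)
{
    float  centerDepth = SceneDepthTex.Sample(LinearClamp, UV).r;
    float3 centerColor = SceneColorTex.Sample(LinearClamp, UV).rgb;

    // Red channel shifted one way, blue the other; green stays at the centre tap.
    float2 uvR = UV + Offset;
    float2 uvB = UV - Offset;

    float depthR = SceneDepthTex.Sample(LinearClamp, uvR).r;
    float depthB = SceneDepthTex.Sample(LinearClamp, uvB).r;

    // Only keep an offset sample if it lies at roughly the same depth as the
    // pixel being shaded; otherwise it belongs to nearer geometry (the
    // foreground object) and would smear it into the background.
    float r = (abs(depthR - centerDepth) < DepthThreshold)
                  ? SceneColorTex.Sample(LinearClamp, uvR).r
                  : centerColor.r;
    float b = (abs(depthB - centerDepth) < DepthThreshold)
                  ? SceneColorTex.Sample(LinearClamp, uvB).b
                  : centerColor.b;

    return float3(r, centerColor.g, b);
}
```

The same depth test can be applied to however many taps your aberration uses; rejected taps just fall back to the centre pixel, so the foreground never bleeds into the distorted background.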