Adaptive-Blended TAA, a little magic for your sharp and responsive scenes

I’m afraid this is not going to help. The problem here is that neither the MovementFactor nor the depth-differences algorithm will be able to figure out where the shadow spot is on the screen. While moving objects do show some improvement with regard to ghosting in both cases, the end result hardly justifies the solution, since unfortunately I have seen large amounts of crawling jaggies. Also, once you set up the blurred pixel, the weighting has to be adjusted to avoid luminance issues. You can currently investigate how the original author has done this with the Filtered pixel data, which is a carefully weighted summary of the neighboring pixels.
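
For reference, here is a minimal sketch (plain C++ rather than the actual shader) of the kind of luminance-weighted neighborhood filter I mean; the names and the 1/(1+luma) weight are my assumptions, not Epic’s actual Filtered-pixel code:

```cpp
#include <array>

// Illustrative sketch only: a 3x3 neighborhood filter that weights each
// sample by an inverse-luminance factor so very bright samples do not
// dominate the average (one common way to avoid luminance issues).
// Names and the weighting function are assumptions, not Epic's code.
struct Color { float r, g, b; };

static float Luma(const Color& c)
{
    // Rec. 709 luma approximation.
    return 0.2126f * c.r + 0.7152f * c.g + 0.0722f * c.b;
}

Color FilteredPixel(const std::array<Color, 9>& neighborhood)
{
    Color sum{0.0f, 0.0f, 0.0f};
    float weightSum = 0.0f;
    for (const Color& sample : neighborhood)
    {
        // Tone weighting: brighter samples contribute less, which keeps a
        // single very bright neighbor from blowing out the filtered result.
        const float w = 1.0f / (1.0f + Luma(sample));
        sum.r += sample.r * w;
        sum.g += sample.g * w;
        sum.b += sample.b * w;
        weightSum += w;
    }
    return { sum.r / weightSum, sum.g / weightSum, sum.b / weightSum };
}
```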

The Uncharted 4 presentation was actually talking about blurred pixels, which were applied in the case where the character ghosts were cut out: instead of the “untreated” pixels, blurred pixels were shown in order to effectively reduce aliasing in those areas. This is currently part of Epic’s Temporal Anti-Aliasing solution as well, represented by the Filtered pixel, which is applied whenever the blend factor requires new data instead of the history. The only problem with Epic’s solution is that it presents sharp pixel information that is prone to contain aliased pixels too, and that should be weighted differently during movement to get a smooth result instead. It would, however, be very dangerous to just apply this to the entire screen, since the end result would be a blurred image, which is against the very idea :smiley:
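
To make the mechanism concrete, here is a hedged sketch of the basic blend being described, with the history mixed toward the filtered pixel by the blend factor (names are illustrative, not engine code):

```cpp
// Illustrative only: the basic temporal blend being discussed. A blend
// factor of 0 keeps the history pixel, 1 takes the new (filtered) pixel
// entirely. Names are assumptions, not engine code.
float TemporalBlend(float history, float filtered, float blendFactor)
{
    return history + (filtered - history) * blendFactor;
}
```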

But this actually inspired me, thanks to your input! Since the MovementFactor (or the DD factor in my case) can be used not only for the final blending but also as a coefficient to manipulate the pixel weightings, it would be possible to apply different amounts of blur (with a 3x3 kernel) relative to this factor. Simply put: white areas get full blur, gray gets half blur, dark gets no blur. That way a large portion of the screen can remain sharp, while the masked areas are a bit blurry but less aliased (compared to the current implementation). This would be something worth investigating, and I might just do that in my next session :smiley:
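
Here is a rough sketch of that idea, assuming the factor is already available per pixel in the 0..1 range and using a plain box blur as a stand-in for whatever kernel ends up being used:

```cpp
#include <array>

// Illustrative sketch of the adaptive-blur idea: blend between the sharp
// center pixel and a 3x3 box-blurred value, using the per-pixel movement /
// depth-difference factor as the coefficient. factor == 0 keeps the pixel
// sharp, factor == 1 uses the fully blurred value, anything in between is a
// partial blur. The names and the plain box kernel are assumptions.
float AdaptiveBlur(const std::array<float, 9>& neighborhood, float factor)
{
    // neighborhood[4] is the center (sharp) pixel of the 3x3 block.
    const float sharp = neighborhood[4];

    // Plain box blur as a stand-in kernel.
    float blurred = 0.0f;
    for (float sample : neighborhood)
    {
        blurred += sample;
    }
    blurred /= 9.0f;

    // White (factor 1) areas get full blur, gray half blur, dark none.
    return sharp + (blurred - sharp) * factor;
}
```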

I don’t think we can completely fix ghosting with the current re-use of frames. If we somehow move to only use the last raw frame instead of the blended one we could get the ghosting down to 1 frame. But I guess we lose a lot of smoothness by doing that.
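
Just to illustrate the trade-off (a simplified sketch, not the engine’s actual accumulation; names are assumptions):

```cpp
// Recursive accumulation: the blended history carries contributions from
// many past frames, so a stale value can linger (ghosting) but the result
// is temporally smooth.
float RecursiveHistory(float blendedHistory, float current, float blendFactor)
{
    return blendedHistory + (current - blendedHistory) * blendFactor;
}

// Reusing only the previous raw frame limits any ghost to a single frame,
// at the cost of much weaker temporal smoothing.
float TwoFrameBlend(float previousRaw, float current)
{
    return 0.5f * (previousRaw + current);
}
```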

Yes. The only thing that could help here is to use the Lighting Only channel shown in the editor to effectively mask the shadow spots, since this view is not contaminated by texture information.
2017-08-28 15_38_26-Loco2 - Unreal Editor.jpg
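
One hedged sketch of how such a mask could be formed, assuming the lighting-only luminance of the current and previous frame were both available to the pass (the threshold and all names are hypothetical):

```cpp
#include <cmath>

// Hedged sketch of the masking idea: because the lighting-only buffer has no
// texture detail, a frame-to-frame difference in its luminance mostly flags
// moving shadow / light spots. The threshold and the names are hypothetical.
float ShadowSpotMask(float lightingOnlyCurrent, float lightingOnlyPrevious,
                     float threshold = 0.05f)
{
    const float diff = std::fabs(lightingOnlyCurrent - lightingOnlyPrevious);
    // Map the difference to a 0..1 mask; 1 means "treat as a moving shadow".
    return (diff > threshold) ? 1.0f : diff / threshold;
}
```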

This is very interesting information, and it would be nice to attach it as an input to the TAA pass so it has access to it. I’m just not sure how to do that!