I’ve written the material with HLSL; it was not difficult at all, I should mention.
As before, the image needs to be processed in two passes, vertical and horizontal, and separately. (Though it makes a cool effect if both passes are applied simultaneously.)
For the vertical pass (the horizontal one is commented out with /**/):
int TexIndex = 14; // PostProcessInput0; can be 13 as well, I think
bool bFiltered = true; // Can be false
// Center tap, weighted with the falloff at i = 0; the loops below start
// at 1 so the center sample isn't counted twice (UV + 0 and UV - 0 are the same lookup)
float3 blur = SceneTextureLookup(UV, TexIndex, bFiltered) * (1.0f / (3.14159f * amount));
//Vertical pass
for (int i = 1; i < cycles; i++)
{
float c = 1.0f / (3.14159f * amount);
float e = -(i * i) / (amount);
float falloff = (c * exp(e));
blur += SceneTextureLookup(UV + float2(i * rx, 0), TexIndex, bFiltered) * falloff;
blur += SceneTextureLookup(UV - float2(i * rx, 0), TexIndex, bFiltered) * falloff;
}
//Horizontal pass
/*for (int j = 1; j < cycles; j++)
{
float c = 1.0f / (3.14159f * amount);
float e = -(j * j) / (amount);
float falloff = (c * exp(e));
blur += SceneTextureLookup(UV + float2(0, j * ry), TexIndex, bFiltered) * falloff;
blur += SceneTextureLookup(UV - float2(0, j * ry), TexIndex, bFiltered) * falloff;
}*/
//blur /= 2 * cycles + 1;
return blur;
And material nodes:
I also used a Gaussian falloff (normal distribution), which is defined by the following function:
float gauss(float x, float amount) {
// amount plays the role of sigma^2; note that a properly normalized 1D Gaussian
// would use 1.0 / sqrt(2.0 * PI * amount) here, which is part of why the
// brightness hack mentioned below is needed
float c = 1.0 / (2.0 * 3.14159265359 * amount);
float e = -(x * x) / (2.0 * amount);
return c * exp(e);
}
Here x is the loop iteration number and amount is the exponential spread (the smaller this number, the less blurry the result).
Wikipedia: Normal Distribution
In terms of physical interpretation, this is roughly how blurry materials distribute light, so the result can look very believable.
There is a little hack with image brightness, because the integral of a sampled, truncated normal distribution is… well… not exactly 1, so the weights don’t quite sum to unity.
Result:
NOTE: If you want to use this with a translucent material, then you need to use Texture2DSample(Tex, TexSampler, UV) instead of SceneTextureLookup… or something else that contains the image behind the object and works with one of these samplers.
Needs optimization! The input resolution should be downscaled by 2 or 4 times, which will greatly improve performance, because 50 iterations currently takes 5–6 ms at 1080p, which is bad.
I wish I had something like CUDA here, because then I could precalculate the normal distribution once and use it as a const array, which would cut up to 90% of the calculation time (exponentiation is a very expensive operation). Are there any buffers or something like that?
Actually THE BEST optimization would be a Fast Fourier Transform (convolving in frequency space), but I don’t see any way to do that here.
We can easily make the blur depend on depth by “regulating” the cycles variable with scene depth. The other question is how best to do that.
@Antidamage You know, we could even blur under UMG using a render target! That’s a cool idea! I’m going to make it, but a bit later; university takes a lot of time.
Material is free to use and edit :o