I just started getting interested in this fantastic realtime world, coming from a VFX background (where I've had to wait hours per frame for renders).
I browsed the forum a bit, and aside from Lionhead's effort to include LPV (Light Propagation Volumes) in this release, I didn't see much work from others, just a lot of complaints about the lack of SVOGI (which was looking very good, btw).
So I just wanted to put together a list of alternative algorithms I found online, hoping that someone with programming experience, access to the source, and a lot of spare time can implement something the way Lionhead did.
So far the most interesting stuff I found is this:
Those are just a couple of papers I stumbled across on the internet. Usually I read up on offline rendering (path tracing, etc.) to document myself, but all this realtime stuff has gotten my ears whizzing!
Of course, anyone is welcome to add their findings.
One thing I wanted to know: is there any research on faking a path-tracing algorithm using GPU particles? Unreal Engine can manage a lot of those easily, and they can bounce. Seeing that the rendering engine uses a deferred path, would it be possible to just use those particles and let them illuminate the scene at each bounce while losing energy? Also, why not store the light afterwards in something like a voxel grid for the indirect bounces, and let the particles illuminate every cell they traverse? That way you could emulate participating media and volumetrics.
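For what it's worth, here's a rough CPU-side sketch of that voxel-grid idea in Python, just to make the concept concrete. Everything here is made up for illustration (grid resolution, step size, the absorption factor, bouncing off the walls of a unit cube instead of real geometry); a real GPU implementation would look completely different, this only shows the "particles deposit energy into every cell they pass through, losing energy per bounce" part.

```python
import random

GRID = 16          # voxel grid resolution (arbitrary for this sketch)
CELL = 1.0 / GRID  # the "scene" is a unit cube

def trace_particle(grid, pos, dirn, energy, bounces=3, step=0.02, absorb=0.5):
    """March a light particle through the unit cube, depositing a bit of
    energy into every voxel it traverses; on hitting a wall it bounces
    and loses a fraction of its energy (made-up absorption factor)."""
    for _ in range(bounces):
        while True:
            # deposit energy into the cell the particle is currently in
            i = min(int(pos[0] / CELL), GRID - 1)
            j = min(int(pos[1] / CELL), GRID - 1)
            k = min(int(pos[2] / CELL), GRID - 1)
            grid[i][j][k] += energy * step
            # advance along the ray by one step
            pos = [p + d * step for p, d in zip(pos, dirn)]
            # reflect off the cube walls; a real version would hit geometry
            hit = False
            for a in range(3):
                if pos[a] < 0.0 or pos[a] > 1.0:
                    pos[a] = min(max(pos[a], 0.0), 1.0)
                    dirn[a] = -dirn[a]
                    hit = True
            if hit:
                energy *= (1.0 - absorb)  # lose energy at each bounce
                break
    return grid

# fire 100 particles in random directions from the center of the cube
grid = [[[0.0] * GRID for _ in range(GRID)] for _ in range(GRID)]
random.seed(1)
for _ in range(100):
    d = [random.uniform(-1.0, 1.0) for _ in range(3)]
    n = sum(x * x for x in d) ** 0.5 or 1.0
    trace_particle(grid, [0.5, 0.5, 0.5], [x / n for x in d], energy=1.0)

total = sum(v for plane in grid for row in plane for v in row)
print(f"total deposited energy: {total:.2f}")
```

The grid would then be sampled during shading for the indirect term, much like an LPV volume is, which is why the two ideas felt related to me.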
Just thoughts, though. I love the looks you can get now, especially the reflection system. I was astounded when I saw Remember Me use it on UE3, and I'm glad you picked up from there and refined it.