I know this has been discussed a few times here, but honestly I still find the current SSGI implementation quite limited and closed. Almost nothing is tweakable, and it looks blurry and poorly defined, IMHO.
Does anyone know if there is any plan to implement SSRTGI instead, either from the Epic team or as an external plugin? How difficult would it be to code and implement this?
Also, I've been experimenting with some interesting approaches that get close to offline renders by combining Final Gather and ambient cubemaps with RTAO, BUT an even more ideal combo would be FG with SSGI (or SSRTGI, of course).
**But currently SSGI cannot be combined with ray traced GI like Final Gather… Does anybody know why?** It's kind of frustrating. Any ideas? Does anyone know of a branch already working on that?
Just in case, here are three good examples of what I'm trying to say:
The very worst thing any GI solution can be in this day and age is "tweakable". GI should always be just one enable/disable toggle that gives results as close to physically accurate as possible, within the frame of the given solution, of course. So I'd expect a solution for realtime games to be of worse quality than a solution for offline rendering. But having GI with actual multipliers and knobs is just not acceptable in the third decade of the 21st century. Whenever a modern GI solution has them, it's simply a failure to handle those parameters automatically due to a technical limitation. That's why Corona Renderer has pretty much taken over archviz: people realized that fewer knobs actually result in a win/win combination of better productivity and better-looking, higher-quality results.
To be clear, I agree that GI should be improved, even SSGI for that matter. I just don't believe making it more "tweakable" is the right way to go about it. IMHO the right way is for it to remain a single on/off toggle that simply provides much better results when toggled on.
That being said, I agree that if the current SSGI does not suffice, it should be improved, especially if there are real examples of it being done better elsewhere.
About your examples though:
The first video looks exactly like how Unreal's SSGI behaves; I am not seeing much difference. Perhaps try disabling the r.SSGI.HalfRes flag to get higher SSGI resolution.
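For reference, this and the other SSGI console variables can be set in ConsoleVariables.ini or typed into the in-editor console. A quick sketch (cvar names as they exist in 4.25/4.26-era builds; worth double-checking against your engine version):

```ini
; ConsoleVariables.ini — push SSGI toward its highest quality
r.SSGI.Enable=1    ; turn screen space GI on
r.SSGI.HalfRes=0   ; trace at full resolution instead of half resolution
r.SSGI.Quality=4   ; highest quality preset (more rays/steps per pixel)
```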
The second video makes it impossible to judge the actual quality of the GI, since there are many textures and details involved, almost as if specifically crafted to hide the GI solution's inefficiencies. Again, I am having a hard time seeing anything of higher quality than what SSGI in UE4 currently offers.
The third one is SSRT; it is actual ray tracing, just in screen space. UE4 already has ray traced GI, and when it comes to ray tracing, I don't think that tracing rays outside of the view comes with any significant performance cost.
When I run simple tests with just one light, either directional or point, I get pretty much the same, if not better, quality/definition compared to the examples in that 80lv article:
I am also a bit disappointed with the direction Epic has taken for GI (now I don't mean SSGI). Especially the choice of Final Gather, which, empirically, seems like the absolute worst way to perform dynamic global illumination. I'd consider neither BFGI nor FG to be anywhere near usable or production ready in their current state.
I agree that the ideal solution would be some dynamic radiance cache for secondary bounces, with SSGI as a form of "final gathering" for the first bounce, rather than the hideous, splotchy, Mental Ray style Final Gather we have now.
If you are after the absolute highest quality GI, then Luoshuang's GPU Lightmass plugin has proven to be that for me. But of course that comes at the cost of baking the lighting, which won't work for the usual AlexRomanesque timelapse shots.
Lastly, in the current state, I think your best bet is a combination of a ray traced skylight where applicable and Final Gather. BFGI is just too slow.
Anyway, I wouldn't dwell on this SSGI + FG idea, simply because 4.26, which is about to be released, is the last UE4 version. After that, next year, the first beta of UE5 will be released with Lumen, which is supposed to be a new, high-performance GI system: the future of realtime GI. So I am afraid you are thinking about investing development resources, either yours or Epic's, into combining two different GI solutions, both of which are already in the process of being deprecated in favor of something much better.
BTW, are you the real Alex Roman? I am having a bit of a hard time connecting this combination of overly excited tone and somewhat imperfect English with the same person who wrote The Third & Seventh book.
The very same. Flesh and bones! Hahaha! First of all, thanks a lot for being the only one paying attention to this thread; it means a lot to me. Second, my grammar; haha, yup, you caught me!
I'll try to be brief. I absolutely agree with you, 200%. I'm calmer now, and I can see everyone was more or less thinking the same thing. Also, I've been talking with key people at Epic working on Lumen, and yes, it makes no sense to try to pull off "some magic tricks" now. Too late. What is really funny about all this is that Lumen (which is NOT ray traced) will work on every GPU. Nvidia is putting a lot of effort into amazing things like DLSS 2.0, RTXGI, and RT caustics that cannot be used on next-gen consoles… And BTW, I don't see either Lumen or RTXGI being detailed enough to be compared to old-fashioned offline renders. Curious, because I'm tweaking UE4's available tools (as much as UE4 lets me, which is almost nothing) and getting results really close to Corona or V-Ray. NOT realtime for gaming, but a decent 20 fps…
I'll post comparatives with "ground-truthed" refs later if you wish.
Again, thanks a lot Rawalanche, let’s wait and keep our fingers crossed…
Yes, they certainly look different, but when I look at them, I don't consider the UE4 one too bad a result for a 32 millisecond render. Although I am generally against tweaking, there are a lot of parameters that can be tweaked currently, such as being able to completely disable the realtime denoiser for RTGI, or reduce its strength and brute-force it with more samples. But you'd end up with something far from realtime, unfortunately.
So if you want something to play with, there’s tons of it:
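To give a concrete idea, here are some of the ray traced GI cvars I mean. The names below are from 4.25-era builds and this is only a starting point, not an exhaustive or authoritative list; verify each one against your engine build:

```ini
; ConsoleVariables.ini — ray traced GI knobs to experiment with
r.RayTracing.GlobalIllumination=1                   ; 1 = brute force, 2 = final gather
r.RayTracing.GlobalIllumination.SamplesPerPixel=16  ; more samples = less noise, slower
r.RayTracing.GlobalIllumination.MaxBounces=2        ; extra indirect light bounces
r.RayTracing.GlobalIllumination.ScreenPercentage=100
r.GlobalIllumination.Denoiser=0                     ; disable the realtime denoiser and brute-force instead
```

Cranking SamplesPerPixel with the denoiser off is how you'd trade framerate for the "offline render" look mentioned above.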
I'd do the tests myself, but unfortunately, as I don't have an RTX card, I'd probably not arrive at any meaningful results.