@DanielW, in the stream you said that Epic is not planning to support both the Deferred and the new Forward renderer in the same packaged build due to long shader compile times. If, however, one were willing to wait the extra time, would there be some way to do it without fully rebuilding the system?
I saw that the current forward renderer is mixed in with the deferred code, so there doesn’t appear to be an easy way to set a separate target RHI for it (the setting we have now just toggles between the two). I see that UE4 does support Switch with both forward and deferred target RHIs, so I’m still kind of curious why the shader compilation times are an issue; people who really need both would surely be willing to wait, and it wouldn’t really hurt anyone else since it would be optional.
Just to give this some more context, I have a game in development that is mainly targeted at desktop but will have an additional VR mode. I’d love to use deferred as the renderer on desktop and forward for low-end hardware and VR (to get MSAA), but since swapping between them isn’t possible at the moment, I now need to make the game work in forward only. I’m sure that almost all games with separate desktop and VR modes are in roughly the same situation as I am.
I meant that we don’t support forward and deferred for the same platform simultaneously, which is mostly a code complexity issue (most shaders would have to be compiled twice).
Technically, if you can change the value of r.ForwardShading per platform, you can do forward or deferred on each platform within the same project. There are some gotchas (when compiling shaders you have to look at the right platform’s setting), but this would not be hard to implement.
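To illustrate the idea (a hypothetical sketch only; as noted, the shader compiler would still have to read the target platform’s value, so this is not sufficient by itself today), per-platform overrides of the existing project setting could look like this:

    ; Config/DefaultEngine.ini - project default, e.g. deferred for desktop
    [/Script/Engine.RendererSettings]
    r.ForwardShading=False

    ; Config/<Platform>/<Platform>Engine.ini - hypothetical per-platform override
    ; for whatever target should use the forward renderer (low-end, VR, etc.)
    [/Script/Engine.RendererSettings]
    r.ForwardShading=True

The DefaultEngine.ini section and the r.ForwardShading setting already exist; the per-platform override is exactly the part that needs the shader-compilation fix mentioned above.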
There are some challenges supporting both rendering paths within a single project, though. The forward and deferred rendering paths are different enough that you have to create content knowing, to some degree, which renderer you will be on. For example, certain meshes will alias badly with MSAA, materials have fewer samplers available, certain rendering features don’t work in forward, masked materials work a bit differently with AlphaToCoverage, etc.
In summary we still only support choosing the rendering path per project, but we might change that in the future. You bring up a good point for desktop games that want to have a VR version instead of being a VR game.
Adding my support for this 100% as well. I’ve been planning a hybrid VR game since before VR was viable: a multiplayer game that supports both VR and standard clients simultaneously, giving VR players both unique advantages and disadvantages. Given the limits of forward rendering, I’ve simply planned to wait, counting on hardware continuing to improve until all features can be supported without it; however, any improvements on the software side are gladly welcomed.
I think what he said was that they might make it possible in the future to switch between forward and deferred, like 0lento needs for his VR mode. That wouldn’t automatically mean you could use both simultaneously, though; I think that would require quite a few changes to the way UE4 renders a scene.
And hey @DanielW, I would still love to finally see SSAO support in forward.
You said it would be “easy” to add. I actually made Nvidia’s HBAO+ work with the forward renderer; there it was really quite simple to let it output to the IndirectOcclusionTexture. Then I noticed that the performance of HBAO+ is really bad in VR, though: while SSAO is ~0.45 ms (both eyes), HBAO+ is ~2.5 ms, which is unfortunately way too much. So I had to go back to deferred… It definitely showed me how optimized UE4’s SSAO is.
Since adding forward support for HBAO+ was so easy, I also tried to add forward support to SSAO, but that seems way harder to me. SSAO is currently calculated together with all the other post-processing in the rendering composition graph, but to make it output to the IndirectOcclusionTexture, it has to run before the base pass, I think. So it needs to take the depth buffer from the prepass and use that to write to the IndirectOcclusionTexture before any post-process work has run. I failed at decoupling SSAO from that graph; it relies on a lot of things that just aren’t there yet before the base pass, and it was never designed to run outside of the graph, I think.
Did I miss something, or is it just not really that easy?
Remove it from the composition lighting PP graph if forward shading
Add it to a new graph which is run at the right part of the frame (same spot indirect capsule shadows are run)
Modify SSAO to blend with indirect capsule shadows, or clear the texture if SSAO runs first
Make sure the version of SSAO which only relies on depth is used (no GBuffer normal available)
Everything so far is straightforward.
Then after all that, it may turn out that SSAO is much too noisy because it relied on temporal AA. Making it high enough quality is an open-ended task and probably the most difficult part of the whole thing.
I don’t have a lot of time to spare right now unfortunately, deadlines everywhere.
Oh, thanks very much for these steps! I didn’t know that there would be multiple graphs; that makes it a lot nicer. I wish there were more (any) documentation about how the different classes in the UE4 renderer work together and what is done where.
I only want to use it together with temporal AA. I’m not sure exactly what you mean: that it will look bad without TAA, or that it will actually look worse with forward + TAA than with deferred + TAA? Won’t TAA smooth it out the same way in forward as in deferred?
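(For context, which anti-aliasing method runs is still just the usual project setting, independent of the rendering path; an illustrative DefaultEngine.ini excerpt:

    [/Script/Engine.RendererSettings]
    ; 0 = off, 1 = FXAA, 2 = Temporal AA, 3 = MSAA (MSAA only takes effect with the forward renderer)
    r.DefaultFeature.AntiAliasing=2

so TAA itself stays available in a forward project.)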
Sorry I wasn’t clearer. I meant simultaneous clients in the context of a server-based game: one client could be using forward rendering and another could be using deferred. It would be nice to be able to switch between them without restarting or compiling different builds, though. It sounds like that might not be good enough either, since other settings are not universally compatible between the two. Honestly, though, I kind of hope one or the other becomes obsolete in the future, because otherwise it’s like waiting for console tech to catch up.
I still remember being excited about the old GDC Kite demo and DFGI, and now that tech is sitting unfinished in the backlog. During GDC this year, one of my favorite parts was when it was mentioned that “Thanks to implementing this much requested feature in-between projects, we can now do all of this!” I’m just left with the feeling that the engine is (officially) focused on supporting specific games that depend on static worlds which can be largely precomputed. Never mind dynamic AO and GI; how about shadow performance from a single directional light?