This is very exciting! Just saw this up on Tom's Hardware. Check this out!
Not only might this bring SLI to UE4, but being able to use practically any card from any maker is pretty crazy! This is extremely exciting and could prove a great solution for games that really need the extra performance for VR, or for eventual GI implementations like Dan's new heightfield-based method or NVIDIA's VXGI. Maybe Daniel can chime in on what he thinks, but DX12 looks to be a huge win and a much-needed update for the modern architecture of graphics cards and multi-GPU solutions going forward.
Part of the reason for the lack of SLI support was that, even using the same engine, each game has different requirements and would need its own SLI profile. If this needs something like that, then it might not get support in UE4.
Hey darthviper107! I think another concern DanielW mentioned was that maintaining GPU state would get extremely complex as the engine grows in size; however, if everything is pooled together as if it were managed by a single card, that hopefully won't have to be the case. They are saying that since DX12 doesn't pay attention to which graphics card is in the system, SLI and CrossFire may go obsolete, or at least no longer require the underlying vendor tech. If that happens, then hopefully multi-GPU setups won't require custom profiles per game and such. It almost sounds as if there will be some sort of virtual graphics driver/device that you will be able to enumerate, which will handle all the communication with the graphics cards in the system. I'm guessing we will hear more about this at GDC next week at the latest. Fingers crossed, but it's good to know Microsoft is thinking about how to leverage multiple GPUs more efficiently, and it's definitely about time to see multi-GPU methods updated. I'm curious, though, how the drivers are going to come into play, especially if you had an NVIDIA card paired up with an AMD one, but hopefully this plays out well and introduces some major benefits.
The concern that I have is with the concept of SFR (split-frame rendering). If one graphics card is slower than the other, it could take longer to render its half of the screen. That might not be super noticeable on a standard monitor… or it could be, depending on the game and hardware. Microsoft will need to look at that side and clock the rendering speed down to the lowest common denominator.
But… I can’t be the only one that has this concern, right? I mean… I would assume that the engineers at Microsoft would have thought of this… right?
That's actually a very good point SaviorNT, and I agree. To be honest, the last time I SLI'ed two cards was back in the 3dfx Voodoo days, which, if I'm not mistaken, did its rendering with SFR. But the upside is that pairing three or four GTX 980s would probably scale much better in benchmarks.
From the article, there would have to be work done on the engine side to control how it runs; it's not automatic. And depending on your game, you may want it to do things differently, which is the same issue as with SLI.
I absolutely agree some work would definitely need to be done. However, they are saying that implementing SFR would be painless compared to how AFR is implemented in game engines today. I'm sure that if you're going to optimize as much as you can for your game, there will be complexities in managing the resources effectively to get the best bang for your buck, but it seems like it will make the decision much easier for developers on the fence about implementing it. They also say there is an auto option if you want it to manage the resources itself, but most devs will likely manage it manually. In regards to UE4, I was more or less referencing DanielW's reply to a post I made back during 4.6 about SLI, where Dan says…
Based on this, I'm assuming the major hurdle in UE4 is state management, and maybe it's possible this will alleviate some of the blocks. Going forward, especially for VR, multi-GPU will be much more needed, and if it can be done without complicating things, it would be a great thing to see make it into UE4. Right now it's speculation and there aren't many details on how this works in the API, but who knows; just positive thinking!