Mantle + VR

Hi guys,

I’m curious why Epic has no future plans to support Mantle, especially since it seems like a perfect fit for VR (see the “AMD Mantle API support and new gen GPUs” thread in Programming & Scripting on the Unreal Engine Forums).

With the DK2 having a refresh rate of 75 Hz, you need to stay at or above 75 fps to have a pleasant experience (with low persistence enabled, which you want in order to get the most out of the headset). Even slightly dipping below that value will give you an unpleasant feeling and kill the immersion.

For the upcoming CV1 of the Oculus Rift the refresh rate is speculated to be 90 Hz or even higher, meaning you would need at least 90 fps to stay in the comfort zone. Not average fps but minimum fps!
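To put those refresh rates in perspective, here is the per-frame time budget each one implies. This is simple arithmetic, not a benchmark:

```python
# Per-frame time budget implied by a given refresh rate.
# Simple arithmetic, not a measurement.
def frame_budget_ms(refresh_hz):
    """Maximum time (in ms) available to render one frame."""
    return 1000.0 / refresh_hz

for hz in (75, 90):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
# 75 Hz -> 13.33 ms per frame
# 90 Hz -> 11.11 ms per frame
```

Going from 75 Hz to 90 Hz shaves more than 2 ms off an already tight budget, which is why minimum (not average) frame times matter so much here.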

There are a lot of benchmarks out there showing that Mantle greatly improves fps, especially the minimum fps, by removing CPU overhead. So complex scenes might become a possibility in UE4 for the CV1.

With Mantle being an open standard, Nvidia could easily adapt their drivers and benefit from it too. At the moment they don’t want to, but I feel that if Epic jumped on the bandwagon, it could be the final straw that makes them change their minds, and everyone would benefit. The list of games supporting Mantle is growing: http://videocardz.com/51018/exclusive-upcoming-games-support-mantle

Of course there is DirectX 12 on the horizon (end of 2015). But given Microsoft’s history of tying feature updates to new versions of Windows, I am not really looking forward to that. You will most definitely have to update to Win9 to take advantage of it. After the fiasco that Win8 turned out to be, I’m not sure I want to know what Win9 holds in store for us (Win8 has a user base of 12.54% almost 2 years after release; see “Windows 8 OS usage declines while Windows XP usage increases” on Storage Servers).

I would be happy to stay on Win7, and Mantle would allow me to do so while still benefiting from future developments in this area.

Epic has said before that they aren’t interested in adding support for things that aren’t supported on all graphics cards; since Mantle only works on AMD GPUs, it’s not likely that they are interested in adding it.

Alternatively, DirectX 12 is supposed to improve performance and is supported on all brands of graphics cards, so it will most likely be added very quickly.

Benchmarks I’ve seen aren’t quite as conclusive as you claim. Increase the resolution to 2560x1440 or 4K and the Mantle advantage dwindles. Yet those resolutions are exactly what we need in VR. So I cannot really agree with your assessment that Mantle + VR would be a great fit. I’m not saying it would be bad, but is it really worth dedicating considerable development time to supporting it in UE4 at this point, before even the main gaming video card manufacturer jumps on the bandwagon? I don’t think so.

Meh, edited out unneeded stuff.

I agree that Mantle seems like a perfect fit for VR, and one of the best things that could happen to give Mantle some traction would be for Unreal Engine 4 to support it.

DirectX 12 could be nice and dandy for Windows when it’s out, but what about Linux? If VR is to become mainstream, one of the key factors could be Valve’s Steam Machines, which people could just plug in and play.

Since UE4 is Open Source, what I really don’t understand is why AMD can’t add Mantle support themselves by creating and maintaining a UE4 Mantle branch.

After all, I believe the developers of Thief added Mantle support to their custom UE3 game themselves, so I guess it wouldn’t be so hard for AMD to do… (or would it?).

UE4 is not Open Source; the source code is available to subscribers, which is very different. If AMD wanted to add support, that might be an option, but they would have to continually keep it updated, and that could be difficult depending on what Epic plans to do with the engine.
Considering the lack of support from the industry and the lack of promotion by AMD, I doubt that something like that would happen.

That is not correct. Mantle would work very well on Nvidia GPUs if Nvidia decided to support it, as opposed to Nvidia’s closed Gameworks libraries, which would hurt AMD quite a lot if Epic decided to integrate those into the engine.
Most users will still be on Win7 when DirectX 12 hits the market. They will most likely not benefit from DirectX 12, but they would from a Mantle API. DirectX 12 is also not an option for Linux and SteamOS users.

Achieving 90+ fps at 1440p+ in a graphically taxing environment will, for most users, simply be unachievable. Most people will not invest that much money in their PCs. So we will have to learn from the consoles and upscale a lower rendering resolution to the native resolution of the Rift. You would still benefit from the higher clarity and absence of a screen-door effect while maintaining high enough frame rates. That’s where Mantle could make a difference!
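Since per-pixel shading cost scales roughly with the square of the linear render scale, even modest upscaling buys a lot of headroom. A quick sketch (the scale factors below are just examples, not recommendations):

```python
# Fraction of native per-pixel shading cost when rendering at a reduced
# linear scale and upscaling to native. Scale factors are illustrative.
def shaded_fraction(scale):
    return scale * scale

for s in (1.0, 0.85, 0.7):
    print(f"render scale {s:.2f} -> {shaded_fraction(s) * 100:.0f}% of native pixel cost")
# render scale 1.00 -> 100% of native pixel cost
# render scale 0.85 -> 72% of native pixel cost
# render scale 0.70 -> 49% of native pixel cost
```

Rendering at 70% linear scale roughly halves the pixel workload, which is the kind of saving that can keep minimum fps above the refresh rate.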

Does Mantle work on Nvidia GPUs? No, it doesn’t. It doesn’t matter whether it could; it doesn’t, and that’s how it is at the moment. Epic isn’t going to add support for it on the off chance that it might. It’s likely that DX12 will work with Windows 7, and since DX11 cards will work with it, along with the Xbox One, it’s likely going to get support very quickly.

In Frostbite 3, Mantle is an optional feature, so those with AMD GPUs can use it, and those with inferior Nvidia cards can still play the game as normal without really missing out on anything (unlike Gameworks technology, which would drastically change the gameplay experience for AMD vs. NV customers).

Surely a similar option in UE4 would please everyone?

Having it exist isn’t a problem; the issue is that it doesn’t work on both Nvidia and AMD. Since it only works on AMD, Epic have said they don’t want to add features that only work on one type of graphics card.

As an engine developer I’d be wary to add such technology in general. Do we really need a third API instead of focusing on the ones we have?

This may be interesting too: http://vr-zone.com/articles/john-carmack-mantle-became-interesting-dual-console-wins/61108.html

Well, I think it’s looking more and more likely that DX12 and OpenGL Next will simply copy most of the features (and philosophy) of Mantle, and Mantle will just become a transitional stage…

Looks like Johan Andersson himself (from Frostbite, one of the main creators/promoters of Mantle) has turned his attention to OpenGL Next:

Let’s just cross fingers that both APIs do a really good job of reinventing themselves.

I think some people are misunderstanding the benefit Mantle brings. Mantle simply reduces the CPU-side overhead of submitting commands to the GPU. This means quicker frame times and less CPU load, because commands no longer have to be funneled heavily through the CPU first, which is how DirectX and OpenGL currently work (though I was told OpenGL has the capability to go low-level à la Mantle/DX12; not sure how true that is).
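As a toy model of that submission overhead (the per-call costs and draw counts below are invented for illustration; they are not measurements of any real driver):

```python
# Toy model: CPU time spent just submitting draw calls each frame.
# The per-call overheads are made-up illustrative numbers; real driver
# costs vary widely and are not measured here.
def cpu_submit_ms(draw_calls, overhead_us_per_call):
    return draw_calls * overhead_us_per_call / 1000.0

draws = 2000                          # a moderately complex scene (assumed)
thick = cpu_submit_ms(draws, 10.0)    # hypothetical high-overhead API path
thin = cpu_submit_ms(draws, 1.0)      # hypothetical low-overhead API path

print(f"high-overhead path: {thick:.1f} ms of CPU submission time")  # 20.0 ms
print(f"low-overhead path:  {thin:.1f} ms")                          # 2.0 ms
```

Against a ~13.3 ms budget at 75 Hz, the high-overhead path alone already blows the frame budget before the GPU does any work, which is exactly the case where a thin API helps and a fast CPU masks the difference.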

If you have something like a consumer i7 at 4.5 GHz or an extreme-level hexacore i7 at 4 GHz, Mantle is going to do almost nothing for you (barring horrible coding in the engine it’s being used with). Most of Mantle’s gains are demonstrated on APUs, which are heavily TDP- and heat-limited. When you go to AMD’s FX-8350 and an R9 290 GPU, your Mantle gains are something like 10 fps at most. My friend with a 4.8 GHz FX-8350 and a SERIOUSLY overclocked R9 290 (because he has this thing about trying to get his 290 to beat my two 780Ms in performance) doesn’t even use Mantle in BF4, because the colour-representation glitches aren’t worth the ~10 fps bonus he gets.

If virtual reality takes off and CPUs start seriously limiting GPUs for 3D (which they usually don’t; as a user of Nvidia’s stereoscopic 3D Vision I can 100% confirm this), then Mantle and DX12 will help greatly. But basically, the stronger your CPU and GPUs, the less benefit low-level APIs grant, unless the engine is badly coded to begin with. I’m not saying there is NO benefit; don’t get that wrong. But it isn’t going to be enough to make everyone shout its praises from the rooftops.

And again I point back to Thief (2014), where Nvidia actually optimized their drivers so much that AMD users running Mantle on powerful PCs still can’t outperform Nvidia users with similarly powerful PCs. Why? Because Thief (2014) is not a CPU-hungry game.

Confirmation that the Rift will run at 90 Hz: Palmer Luckey on Oculus VR's Content-Focused Future - YouTube (skip to 3:15).

OK, then let us just hope that Epic can optimize this engine pretty well. At the moment you take a massive performance hit in VR mode, and fps are not that high to begin with in normal mode.

Otherwise I cannot see UE4 being a suitable engine for VR if even a GTX 780 Ti fails to achieve enough fps in some demo projects. I think it is safe to say that the average consumer will not have a stronger graphics card by 2015/2016.

Achieving 90 fps at more than 1080p with timewarp is going to be a challenge in any engine. I don’t think the problem lies mainly with the engine or the API, but with the nature of VR technology itself. I’d almost say that Oculus are aiming a little too high for a first consumer version. I somehow doubt this will be released in 2015, when the vast majority of people won’t have adequate hardware to run the thing, unless Oculus can somehow optimize it so much that running at a lower fps and/or lower resolution won’t completely destroy the experience. Oculus seem so hell-bent on reducing latency that they forget people will need NASA hardware to run it.
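For a rough sense of the raw throughput involved (the resolutions below are illustrative, since the CV1 panel specs were not public; VR eye buffers are also typically rendered larger than the panel, which only makes this worse):

```python
# Shaded megapixels per second required at a given resolution and refresh
# rate. Resolutions are illustrative examples, not CV1 specs.
def mpix_per_sec(width, height, hz):
    return width * height * hz / 1e6

print(f"1920x1080 @ 90 Hz: {mpix_per_sec(1920, 1080, 90):.0f} Mpix/s")  # 187
print(f"2560x1440 @ 90 Hz: {mpix_per_sec(2560, 1440, 90):.0f} Mpix/s")  # 332
```

Jumping from 1080p to 1440p at 90 Hz nearly doubles the pixel throughput required, before any latency-reduction tricks are accounted for.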

Of all the demos I’ve tried, most of them made in Unity, the only two that run acceptably at the moment are the desktop demo from the config util and HL2. Everything else judders for whatever reason. Sometimes I clearly can’t reach a constant 75 fps; other times it may be a buggy SDK/timewarp causing it. That’s with an i7-4770K and a GTX 670: not the best, but not exactly low-end either.

What is it about the CV1 that means the current DK2’s pleasant rate of 75 Hz is no longer adequate?
I thought going up to 90 just meant it can be better if reached, but isn’t strictly required.

V-sync is required at the moment, but they may go back to a vertical screen for the CV1.

CryEngine, Frostbite and others support Mantle or will support it in the near future. VR could really benefit from it, as high-FOV stereo 3D can more than double your draw calls. This is where Mantle comes to the rescue. It is also possible to use dual rendering for multi-GPU and thus avoid the latency penalty of AFR.

+Will really help stereo 3D rendering, since stereo rendering can double (or more, with a high FOV like VR) your draw calls, and Mantle can push up to 9x more draw calls.
+Can do much better multi-GPU.
+Nvidia can support it too, as it is going open this year (the target is the end of this year).
+Migrating to DX12 will be easy.
+AMD shares all Mantle code with the Khronos Group; OpenGL Next may or may not share some of it.

-Nvidia does not support it at the moment, but may add support…

So +1 for Mantle support, as Unreal Engine is the no. 1 VR engine at the moment!

Yet they have supported PhysX for years. Seems like a bit of a strange statement for them to make.

They support PhysX, but it works with both cards by using the CPU. Normally PhysX gets GPU acceleration if you’re using an Nvidia GPU, and that is not supported in UE4.

Nvidia acquired PhysX in 2008 and then integrated it into their own CUDA system. It’s hardly comparable to the current Mantle situation. PhysX continues to run on all systems, just not hardware-accelerated.