Mantle + VR

I think some people are misunderstanding the benefit Mantle brings. Mantle simply reduces the CPU-side overhead of preparing and submitting commands before they reach the GPU. That means quicker frametimes and less CPU load, because commands no longer have to be funnelled heavily through the CPU and driver first, which is how DirectX and OpenGL currently work (though I was told OpenGL has extensions that can go low-level à la Mantle/DX12; not sure how true that is).

If you have something like a consumer i7 at 4.5GHz or an extreme-level hexacore i7 at 4GHz, Mantle is going to do almost nothing for you (barring horrible coding in the engine it’s being used with). Most of Mantle’s advertised gains are shown on APUs, which are heavily TDP- and heat-limited. Move up to AMD’s FX-8350 with an R9 290, and your Mantle gains are what, 10fps at most? My friend with a 4.8GHz FX-8350 and a SERIOUSLY overclocked R9 290 (because he has this thing about trying to get his 290 to beat my two 780Ms in performance) doesn’t even use Mantle in BF4, because the colour-reproduction issues and glitches aren’t worth the ~10fps bonus he gets.

If virtual reality takes off and CPUs end up seriously limiting GPUs in 3D (which they usually don’t; as a user of nVidia’s stereoscopic 3D Vision I can 100% confirm this), then Mantle and DX12 will help greatly. But basically, the stronger your CPU and GPU, the less benefit low-level APIs grant, unless the engine is badly coded to begin with. I’m not saying there is NO benefit… don’t get that wrong. But it isn’t going to be enough to make everyone shout its praises from the rooftops.
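To make the "stronger CPU, less benefit" point concrete, here’s a back-of-the-envelope sketch. It’s a toy model, not profiling data, and all the millisecond numbers are made up for illustration: a frame can’t finish faster than its slowest stage, so cutting CPU submission time only raises your framerate when the CPU is the bottleneck.

```python
# Toy model: frame time is roughly the slower of the CPU submit stage
# and the GPU render stage. All numbers below are invented, not benchmarks.

def frame_time_ms(cpu_submit_ms, gpu_render_ms):
    """Approximate frame time as the max of the CPU and GPU stages."""
    return max(cpu_submit_ms, gpu_render_ms)

def fps(cpu_submit_ms, gpu_render_ms):
    return 1000.0 / frame_time_ms(cpu_submit_ms, gpu_render_ms)

# Weak CPU, strong GPU: CPU-bound, so halving driver overhead helps a lot.
weak_before = fps(cpu_submit_ms=20.0, gpu_render_ms=10.0)   # 50 fps
weak_after  = fps(cpu_submit_ms=10.0, gpu_render_ms=10.0)   # 100 fps

# Strong CPU, strong GPU: GPU-bound, so the same cut changes nothing.
strong_before = fps(cpu_submit_ms=6.0, gpu_render_ms=10.0)  # 100 fps
strong_after  = fps(cpu_submit_ms=3.0, gpu_render_ms=10.0)  # still 100 fps

print(weak_before, weak_after, strong_before, strong_after)
```

That’s the whole story in miniature: an overclocked i7 is already down in the "3ms submit" territory for most games, so a low-overhead API has nothing left to win back.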

And again I’ll point back to Thief 2014, where nVidia optimized their drivers so well that AMD card users running Mantle on powerful PCs still can’t outperform nVidia card users with similarly powerful PCs. Why? Because Thief 2014 just isn’t a CPU devourer.