Mantle + VR


  • replied
    Nvidia acquired PhysX in 2008 and then integrated it into their own CUDA system. It's hardly comparable to the current Mantle situation. PhysX continues to run on all systems, just not hardware accelerated.



  • replied
    Originally posted by Opamp77 View Post
    Yet they have supported PhysX for years. Seems like a bit of a strange statement for them to make.
    They support PhysX, but it works on cards from both vendors by running on the CPU. Normally PhysX gets GPU acceleration if you're using an Nvidia GPU, and that is not supported in UE4.



  • replied
    Originally posted by darthviper107 View Post
    Epic has said before that they aren't interested in adding support for things that aren't supported on all graphics cards; since Mantle only works on AMD GPUs, it's not likely that they are interested in adding it.
    Yet they have supported PhysX for years. Seems like a bit of a strange statement for them to make.



  • replied
    CryEngine, Frostbite and others support Mantle or will support it in the near future. VR could really benefit from it, as you can more than double draw calls when doing high-FOV 3D. This is where Mantle comes to the rescue. It is also possible to use dual rendering when doing multi-GPU, and thus avoid the latency penalty from AFR.

    +Will really help 3D rendering, as stereo 3D rendering can double (or more with a high FOV like VR) your draw calls, and Mantle can push roughly 9x more draw calls.
    +Can do much better multi-GPU.
    +Nvidia can support it too, as it is going open this year (the target is the end of this year).
    +Migrating to DX12 will be easy.
    +AMD shares all Mantle code with the Khronos Group. OpenGL Next may or may not end up sharing some of that code.

    -Nvidia does not support it at the moment, but may add support...

    So +1 for Mantle support, as Unreal Engine is the no. 1 VR engine at the moment!
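    The draw-call argument above can be sketched with a back-of-envelope model. All the per-call costs below are invented for illustration; only the "roughly 9x" ratio claimed in the thread is taken from the post.

```python
# Toy model: per-draw-call CPU overhead vs. the frame budget in stereo VR.
# Every per-call cost here is an assumption, not a measured number.

FRAME_BUDGET_MS = 1000.0 / 90          # one frame at 90 Hz, ~11.1 ms

def max_draw_calls(per_call_overhead_ms, eyes=2):
    """How many unique draw calls fit in the frame budget when each
    call is submitted once per eye (naive stereo doubles submissions)."""
    return int(FRAME_BUDGET_MS / (per_call_overhead_ms * eyes))

# Hypothetical overheads: a thick driver path vs. a thin Mantle-style path.
thick = max_draw_calls(0.009)   # assume 9 us of CPU time per call
thin  = max_draw_calls(0.001)   # assume 1 us of CPU time per call
print(thick, thin)              # the thin path fits ~9x as many calls
```

    The point is only that stereo halves the per-eye call budget while a thinner submission path multiplies it back, which is why the two effects get discussed together.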



  • replied
    Originally posted by maxGD View Post
    With the DK2 having a refresh rate of 75Hz, you need to stay above 75 fps to have a pleasant experience...

    For the upcoming CV1 of the Oculus Rift the refresh rate is speculated to be 90Hz or even higher, meaning you would need at least 90 fps to stay in the comfort zone. Not average fps but minimal fps!
    What is it about the CV1 that makes the DK2's currently pleasant 75Hz no longer adequate?
    I thought going up to 90 just meant it could be better if reached, but wasn't strictly required.

    V-sync is required at the moment, but they may go back to a vertical screen for the CV1.
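    The 75 vs. 90 Hz disagreement above really comes down to frame-time budgets, and to what V-sync does when you miss one. A tiny sketch, with V-sync modelled in the simplest possible way (frames held until the next vblank):

```python
import math

# Frame-time budget per refresh rate, and the displayed rate under V-sync.

def budget_ms(refresh_hz):
    """Milliseconds available to render one frame at a given refresh rate."""
    return 1000.0 / refresh_hz

def displayed_fps(frame_time_ms, refresh_hz):
    """With V-sync, a late frame waits for the next vblank, so the
    displayed rate is the refresh rate divided by vblanks waited."""
    intervals = math.ceil(frame_time_ms / budget_ms(refresh_hz))
    return refresh_hz / intervals

print(round(budget_ms(75), 1))   # ~13.3 ms per frame at 75 Hz
print(round(budget_ms(90), 1))   # ~11.1 ms per frame at 90 Hz
print(displayed_fps(12.0, 90))   # 12 ms misses the 11.1 ms budget -> 45 fps
```

    This is why the earlier posts insist on *minimum* fps rather than average: one 12 ms frame at 90 Hz doesn't show at 83 fps, it shows at 45, which is what you perceive as judder.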



  • replied
    Achieving 90 fps with timewarp at more than 1080p is going to be a challenge in any engine. I don't think the problem lies mainly with the engine or API but with the nature of the VR technology. I'd almost say that Oculus are aiming a little too high for a first consumer version. I somehow doubt that this will be released in 2015, when the vast majority of people won't have adequate hardware to run it, unless Oculus can somehow optimize it so much that running at a lower fps and/or lower resolution won't completely destroy the experience. Oculus seem so hellbent on reducing latency that they forget people will need NASA hardware to run it.

    Of all the demos I tried, most of them made in Unity, the only two that run acceptably at the moment are the desktop demo from the config util and HL2. Everything else judders for whatever reason. Sometimes I clearly can't reach a constant 75 fps; other times it may be a buggy SDK/timewarp that's causing it. That's with an i7-4770K and a GTX 670, not the best but not exactly low-end either.



  • replied
    Confirmation that the Rift will be running at 90Hz: https://www.youtube.com/watch?v=Tira...ature=youtu.be (skip to 3:15).

    OK, then let us just hope that Epic can optimize this engine pretty well. At the moment you take a massive performance hit in VR mode, and fps are not that high to begin with in normal mode.

    Otherwise I cannot see UE4 being a suitable engine for VR if even a GTX 780 Ti fails to achieve enough fps in some demo projects. I think it is safe to say that the average consumer will not have a stronger graphics card by 2015/2016.
    Last edited by maxGD; 08-17-2014, 07:30 AM.



  • replied
    I think some people are misunderstanding the benefit Mantle brings. Mantle simply reduces the overhead caused by going through the CPU before going to the GPU. This means quicker frame times and less CPU load, by not having to run all commands heavily through the CPU first, which is how DirectX and OpenGL currently do it (though I was told OpenGL has the capability to go low-level à la Mantle/DX12; not sure how true that is).

    If you have something like a consumer i7 at 4.5GHz or an extreme-level hexacore i7 at 4GHz, Mantle is going to do almost nothing for you (barring horrible coding in the engine it's being used with). Most of Mantle's gains are demonstrated on APUs, which are heavily TDP- and heat-limited. When you go to AMD's FX-8350 and an R9 290 GPU, your Mantle gains are something like 10fps at most. My friend with a 4.8GHz FX-8350 and a SERIOUSLY overclocked R9 290 (because he has this thing about trying to get his 290 to beat my two 780Ms in performance) doesn't even use Mantle in BF4, because the colour representation issues and glitches aren't worth the ~10fps bonus he gets.

    If virtual reality takes off and CPUs end up limiting the heck out of GPUs for 3D (which they usually don't; as a user of Nvidia's stereoscopic 3D Vision I can 100% confirm this), then Mantle and DX12 will help greatly. But basically, the stronger your CPU and GPUs, the less benefit low-level APIs grant, unless the engine is already badly coded. I'm not saying there is *NO* benefit... don't get that wrong. But it isn't going to be enough to make everyone shout its praises from the rooftops.
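    The "only helps when CPU-bound" argument above can be captured in a toy model: a frame takes roughly as long as the slower of CPU submission and GPU rendering, so cutting API overhead only moves the needle when the CPU side is the bottleneck. All timings below are invented for illustration.

```python
# Toy model of CPU-bound vs. GPU-bound frames. The frame takes as long as
# the slower of the CPU submit work and the GPU render work.

def frame_ms(cpu_ms, gpu_ms):
    """Assume CPU submission and GPU rendering overlap fully."""
    return max(cpu_ms, gpu_ms)

# Weak CPU, fast GPU: CPU-bound, so cutting API overhead helps a lot.
print(frame_ms(cpu_ms=18.0, gpu_ms=10.0))   # 18 ms before the overhead cut
print(frame_ms(cpu_ms=10.0, gpu_ms=10.0))   # 10 ms after it

# Strong CPU or high resolution: GPU-bound, so the same cut changes nothing.
print(frame_ms(cpu_ms=6.0, gpu_ms=14.0))    # 14 ms
print(frame_ms(cpu_ms=3.0, gpu_ms=14.0))    # still 14 ms
```

    This also matches the resolution point made elsewhere in the thread: raising the resolution inflates the GPU term, pushing the system toward GPU-bound and shrinking Mantle's visible gain.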

    And again I point back to Thief 2014, where Nvidia optimized their drivers so well that AMD card users running Mantle on powerful PCs still can't outperform Nvidia card users with similarly powerful PCs. Why? Because Thief 2014 is not a CPU devourer.



  • replied
    Well, I think it's looking more and more likely that DX12 and OpenGL Next will simply copy most of the features (and philosophy) of Mantle, and Mantle will just become a transitional step.

    It looks like Johan Andersson himself (from Frostbite, one of the main creators/promoters of Mantle) has turned his attention to OpenGL Next:

    https://twitter.com/repi/status/498830555203784705

    http://techreport.com/news/26922/amd...in-opengl-next


    Let's just cross our fingers that both APIs do a really good job of reinventing themselves.



  • replied
    As an engine developer I'd be wary of adding such technology in general. Do we really need a third API instead of focusing on the ones we have?

    This may be interesting too: http://vr-zone.com/articles/john-car...ins/61108.html



  • replied
    Having it exist isn't a problem; the issue is that it doesn't work on both Nvidia and AMD. Since it only works on AMD, they have said they don't want to add features that only work on one type of graphics card.



  • replied
    In Frostbite 3, Mantle is an optional feature, so those with AMD GPUs can use it, and those with inferior Nvidia cards can still play the game as normal without really missing out on anything (unlike Gameworks technology, which would drastically change the gameplay experience for AMD vs. NV customers).

    Surely a similar option in UE4 would please everyone?



  • replied
    Originally posted by maxGD View Post
    That is not correct. Mantle would work very well on Nvidia GPUs if Nvidia decided to support it. As opposed to Nvidia's closed Gameworks libraries, which would hurt AMD quite a lot if Epic decided to integrate those into the engine.
    Most users will still be on Win7 when DirectX 12 hits the market. Those will most likely not profit from DirectX 12, but they would from the Mantle API. DirectX 12 is also not an option for Linux and SteamOS users.
    Does Mantle work on Nvidia GPUs? No, it doesn't. It doesn't matter whether it could; it doesn't, and that's how it is at the moment. Epic isn't going to add support for it on the off chance that it might. It's likely that DX12 will work with Windows 7, and since DX11 cards will work with it, along with the Xbox One, it's likely going to get support very quickly.



  • replied
    Originally posted by darthviper107 View Post
    Epic has said before that they aren't interested in adding support for things that aren't supported on all graphics cards; since Mantle only works on AMD GPUs, it's not likely that they are interested in adding it.

    Alternatively, DirectX 12 is supposed to improve performance and is supported on all brands of graphics cards, so that will most likely be added very quickly.
    That is not correct. Mantle would work very well on Nvidia GPUs if Nvidia decided to support it. As opposed to Nvidia's closed Gameworks libraries, which would hurt AMD quite a lot if Epic decided to integrate those into the engine.
    Most users will still be on Win7 when DirectX 12 hits the market. Those will most likely not profit from DirectX 12, but they would from the Mantle API. DirectX 12 is also not an option for Linux and SteamOS users.



    Originally posted by Gigantoad View Post
    Benchmarks I've seen aren't quite as conclusive as you claim. Increase the resolution to 2560x1440 or 4K and the Mantle advantage dwindles. Yet those resolutions are exactly what we need in VR. So I cannot really agree with your assessment that Mantle + VR would be a great fit. I'm not saying it would be bad, but is it really worth dedicating considerable development time to supporting it in UE4 at this point, before even the main gaming video card manufacturer jumps on the bandwagon? I don't think so.

    *Meh, edited out unneeded stuff.*
    Achieving 90+ fps at 1440p+ in a graphically taxing environment will, for most users, just be unachievable. Most people will not invest that much money in their PCs. So we will have to learn from the consoles and upscale the resolution to match the native resolution of the Rift. You would still benefit from the higher clarity and the absence of a screen-door effect while maintaining high enough frame rates. That's when Mantle could make a difference!
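    The upscaling idea above is just pixel-count arithmetic: GPU cost scales roughly with pixels shaded, so a modest per-axis render scale buys back a large chunk of frame time. The panel resolution below is an assumption (a rumoured CV1-class 2160x1200 panel), and the "cost scales with pixels" rule is a rough approximation.

```python
# Pixel-count arithmetic behind rendering below native and upscaling.
# NATIVE is an assumed CV1-class panel resolution; GPU cost is taken to
# scale roughly linearly with the number of pixels shaded.

NATIVE = (2160, 1200)

def shaded_pixels(scale):
    """Pixels shaded when rendering at a per-axis resolution scale."""
    w, h = NATIVE
    return int(w * scale) * int(h * scale)

full = shaded_pixels(1.0)
scaled = shaded_pixels(0.8)        # render at 80% per axis, upscale to native
print(round(scaled / full, 2))     # ~0.64, i.e. roughly a third fewer pixels
```

    A 20% per-axis reduction cuts shaded pixels by about 36%, which is exactly the kind of headroom the post is talking about trading against clarity.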



  • replied
    UE4 is not open source; the source code is available to subscribers, which is very different. If AMD wanted to add support then that might be an option, but they would have to keep it continually updated, and that could be difficult depending on what plans Epic has for the engine.
    Considering the lack of support from the industry and the lack of promotion by AMD, I doubt that something like that would happen.

