AMD recently showed off its Fury X with 4GB of HBM. A really BIG monster. HBM's main selling point is its hugely increased memory bandwidth.
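(For scale, peak memory bandwidth is basically bus width times per-pin data rate. A minimal sketch plugging in the published figures, a 4096-bit HBM bus at 1 Gbps per pin for the Fury X versus the 980 Ti's 384-bit GDDR5 bus at 7 Gbps; `peak_bandwidth_gb_s` is just an illustrative helper:)

```python
def peak_bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    # Peak bandwidth = bus width (bits) x per-pin data rate (Gbps), / 8 for bytes.
    return bus_width_bits * gbps_per_pin / 8

print(peak_bandwidth_gb_s(4096, 1.0))  # Fury X HBM:     512.0 GB/s
print(peak_bandwidth_gb_s(384, 7.0))   # 980 Ti GDDR5:   336.0 GB/s
```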
I’m not a technical guy, but more bandwidth obviously seems great for loading big textures into memory quickly. That’s great, but if the game industry moves to higher-resolution textures next year, taking advantage of the next graphics cards with 6/8GB of RAM, 4GB looks like it would seriously limit how many of those textures you can keep in memory (rough numbers below).
What do you think, guys? Is 4GB enough for a first generation, or could the standardization of 6/8GB of GDDR5 on other graphics cards turn the Fury X into a bottleneck in the near future? Think of something like Epic’s Kite demo, full of 4K and 8K textures, which ran on a Titan X with 12GB.
Or am I missing some technical detail, or getting a concept wrong?
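To put rough numbers on the texture question: here’s a back-of-the-envelope sketch of how much VRAM a single texture takes. Assumptions (not from the thread): square textures, uncompressed RGBA8 at 4 bytes per texel or block compression (e.g. BC7) at roughly 1 byte per texel, and a full mip chain adding about a third on top; `texture_mb` is just an illustrative helper.

```python
def texture_mb(side_px: int, bytes_per_texel: float, with_mips: bool = True) -> float:
    # Square texture; a full mip chain adds roughly a third over the base level.
    base_bytes = side_px * side_px * bytes_per_texel
    total = base_bytes * (4 / 3 if with_mips else 1)
    return total / 2**20  # bytes -> MiB

print(texture_mb(4096, 4))  # uncompressed RGBA8 4K texture: ~85 MB
print(texture_mb(8192, 4))  # uncompressed RGBA8 8K texture: ~341 MB
print(texture_mb(4096, 1))  # block-compressed 4K texture:   ~21 MB
```

Under those assumptions, fewer than 50 unique uncompressed 4K textures would already fill 4GB, which is why compression and texture streaming matter so much at this capacity.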
4GB is enough for literally right now, when 99% of us are still using 1080p/1440p monitors, and even then just barely.
From the benchmarks I’ve seen, the Fury X consistently beats the 980 Ti/Titan X at those resolutions, but as soon as you bump up to 4K/8K the hard 4GB ceiling really starts to become a bottleneck compared to 6/12GB. The next generation of AMD cards will likely ship with more memory for exactly this reason, but it’s not a barrier for most gamers right this moment.
And you can always SLI/CrossFire.
Game developers won’t be shipping anything as heavy as the Kite demo for a while. There’s no market for it, so no worries for now.
Our games and graphics will improve, as will our optimization and our ability to fully use the resources our hardware offers. I’m no expert, but DX12 is supposed to improve GPU performance by quite a bit while also taking a load off the CPU. The Kite demo seems a bit much for current technology, and it will be a few years before games routinely use textures at those resolutions. Plus, remember that more HD textures also means better cameras, more work, and more time. So even though the hardware can support higher-resolution assets, they’re still more of a hassle than lower-resolution ones. I also think any developer should have a GPU at least one step above what the end user/player is expected to have: if you’re developing a game that will require a 10GB GPU, you’ll probably want 12GB yourself.
Just remember that SLI/CrossFire will NOT increase your VRAM.
Data is mirrored across cards: with CrossFire you’re capped at the VRAM of your smallest card (e.g. if you pair a 2GB and a 4GB GPU, you’re capped at 2GB), and NVIDIA SLI requires cards with the same amount of VRAM in the first place.
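A toy illustration of why mirroring caps you at the smallest card (assuming alternate-frame rendering, where each GPU renders whole frames and so keeps its own full copy of every resource; `effective_vram_gb` is hypothetical):

```python
def effective_vram_gb(cards_gb: list[float]) -> float:
    # Every GPU needs a complete copy of textures and buffers, so usable
    # capacity is the smallest card's pool, not the sum across cards.
    return min(cards_gb)

print(effective_vram_gb([2, 4]))  # a 2GB + 4GB CrossFire pair -> capped at 2GB
```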
So do you guys think buying a 970 is a waste of money ATM?
For my home PC, which I also work on, I’m going to buy a 970 in a couple of days, but this whole VRAM issue makes me wonder.