GTX 970 VRAM problem (so which GPU should I get?)

Please stop being such an Nvidia fanboy here. I don’t want to offend you (I am a great fan of your ocean simulation project), but what you write here is really just “fanboyism”.

R9 290X: 290W, GTX 970: 145W. That is exactly 2x the power. Meanwhile the price difference between the 290X and the 970 is $200. Do you want to calculate how long you would have to own the card before you spend $200 just on power for the graphics card? The difference at idle is negligible, and idle is exactly the state GPUs spend most of their time in.
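Just as a back-of-the-envelope sketch (the 145 W load-power gap comes from the numbers above; the $0.12/kWh electricity price is my own assumption for illustration):

```python
# Rough break-even estimate: hours of full-load gaming before the extra
# power draw of the 290X costs as much as the claimed $200 price gap.
power_gap_kw = (290 - 145) / 1000   # 145 W difference under load, in kW
price_per_kwh = 0.12                # assumed electricity price, $/kWh (illustration only)
price_gap_usd = 200.0               # price difference claimed above

extra_cost_per_hour = power_gap_kw * price_per_kwh        # ~$0.017 per hour of gaming
hours_to_break_even = price_gap_usd / extra_cost_per_hour
print(f"~{hours_to_break_even:,.0f} hours of full-load gaming")   # ~11,494 hours
```

Even at double that electricity price, it would still take thousands of hours of full-load gaming before the power bill closes a $200 price gap.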

Have you even read a single news article about the problem? Whether you use 4K or not doesn’t matter; even at Full HD, games will use 4GB of VRAM in 1 or 2 years (and some already do now).
And the difference in framerate is incredible.

Frametimes_UHD_GTX_980-pcgh.png

The screenshot compares a GTX 980 (clocked down to 970 specs) and a real 970. The 970 has far more irregular frame times, and irregularity is exactly what the brain perceives as stutter. A regular framerate is much more important than a higher average framerate.
If 4GB of VRAM is used, the average fps is one third lower in the test. One third means 33%, not “1% to 3%”. On top of that, the fps is a lot more irregular (more spikes), so it feels like the fps is down by a lot more than “just” 30%.

And in general, it’s clear Nvidia knew about the problem but intentionally lied to their customers to sell more graphics cards. The logical consequence for everyone here should be to stop buying Nvidia cards (as long as they are not a lot better than AMD’s). It’s not the first time Nvidia has lied to their customers like this; AMD never did.

thanks everybody… I will soon go get those beasts

Now I’m not trying to be a troll here, but it’ll probably sound like it: you’ve told him not to be a fanboy and then proceeded to do it yourself, just from an AMD point of view. “AMD never lied to their customers like NVIDIA have”: that’s not an impartial view, is it?

Well, I used to have an AMD Athlon XP processor, and that whole processor was a lie :slight_smile:

Also I should probably add: I played Shadow of Mordor on a GTX 970 the other day, all settings on Ultra, and it ran smooth as you like, getting just under 100 frames per second. Feel free to send your rubbish video card my way if you don’t want it :wink:

No worries! Not offended at all. :slight_smile:

But where are you shopping for a 290x? From what I have seen they are either identically priced or the 970 is slightly cheaper:

ASUS 290x - http://www.ncix.com/detail/asus-radeon-r9-290x-directcu-48-93996-1509.htm - $499 regular price
Gigabyte 290x - http://www.ncix.com/detail/gigabyte-radeon-r9-290x-oc-95-93519-1120.htm - $459 regular price

ASUS GTX 970 - http://www.ncix.com/detail/asus-geforce-gtx-970-strix-0b-102345-1120.htm - $449 regular price
Gigabyte GTX 970 - http://www.ncix.com/detail/gigabyte-geforce-gtx-970-oc-6d-102012.htm - $449 regular price

*All prices noted above are in Canadian dollars

I have read many articles on this actually, and each of them came to the same conclusion in the end. Can you provide a link to the article where you got those charts from? I’d be interested in taking a look at it!

Those are frame times, not frames per second. A 30% increase in frame time for a single frame will not create a 30% FPS drop. It is a sizable difference in that chart for sure, but it isn’t as bad as it looks. I would really like to read the article providing those graphs and see their method of testing.

I don’t see why this slip-up should be a reason for people to not buy Nvidia cards anymore. They are putting a ton of effort into advancing computer graphics (both in hardware and software), and their cards are extremely reliable, fast, and efficient. I’ll leave it up to others to decide what brand they choose, but in the end the card still performs as well as it did before the news came out.

As far as AMD never making a mistake, I highly doubt that is true, but I can’t really comment on that further. I’ve never used a GPU or CPU made by AMD. That’s just my personal preference though, not because I have something against them.

He mentioned the prices.

The chart is from a German article, which is why I had not linked it. Here it is: Golem.de: IT-News für Profis

30% more average frame times is equal to 30% less average fps.

This again sounds as if you were an Nvidia employee. They just lied to their customers, and if someone does that, the logical consequence is to not support it. And the only way customers can say “no, we don’t want you to do things like this” is by not buying the products anymore.

You never used a GPU from AMD, but your personal preference is Nvidia. Why, if you never tried AMD?
I read a thread in a German forum where people discussed whether AMD ever lied to their customers, and nobody there remembered an example. If they did, I would be glad to hear about it.

It’s not the average frame times though, it’s spikes in the frame time going 30% higher. 5 frames out of 30 taking 30% longer will not equate to a 30% drop in overall FPS. To get a 30% drop in FPS would require every frame time to be 30% higher, not just a few of the frames.
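To make the frame-time vs. fps arithmetic concrete, here is a minimal sketch using the illustrative numbers from above (a 30 fps baseline and 5 spiky frames out of 30, purely for illustration):

```python
# fps is the reciprocal of the average frame time, so frame-time spikes on a
# handful of frames move the average far less than a uniform slowdown would.
baseline_ms = 1000 / 30                      # ~33.3 ms per frame at 30 fps

# Case 1: only 5 frames out of 30 take 30% longer (isolated spikes).
spiky = [baseline_ms * 1.3] * 5 + [baseline_ms] * 25
fps_spiky = 1000 / (sum(spiky) / len(spiky))
print(f"spikes only: {fps_spiky:.1f} fps")    # ~28.6 fps, about a 5% drop

# Case 2: every single frame takes 30% longer.
uniform = [baseline_ms * 1.3] * 30
fps_uniform = 1000 / (sum(uniform) / len(uniform))
print(f"every frame: {fps_uniform:.1f} fps")  # ~23.1 fps, about a 23% drop
```

Note that because fps is the reciprocal of frame time, even a uniform 30% frame-time increase works out to roughly a 23% fps drop rather than a full 30%.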

Sorry, I mistyped that last bit: I have used AMD GPUs/CPUs, just never purchased them for my home PC. Again, that’s just my personal preference; nothing against AMD, I just prefer Intel/Nvidia.

But anyway, we’re going off topic here. I totally agree it was a huge mistake on their part, and they will need to regain trust. But there just isn’t enough data out there yet. Sites such as Tom’s Hardware (one of the most reputable sources for benchmarking) will be (and are) running the tests now using multiple games, configurations, and conditions. Until then I am reserving my judgement; if I’m wrong I will come back here and gladly admit it, but so far the data is inconclusive. :slight_smile:

Look at the diagram again. In the top right corner you see that the 980@970 specs has an average fps of 15.1 while the real 970 has an average fps of 11.2. It is a 30% drop in “overall average fps”. The average frame takes ~89 ms to render on the 970 while it only takes ~66 ms on the 980@970. Don’t you see it?

After finding out about this, I am still enjoying my 970. After all, what am I going to do, think that $200 is totally worth the **** of a 980? No. The 970 is still the best cost for performance, even if **** is a bit off.

Sell me your “old” GTX 970s, I’ll pay 10 cents on the dollar. :wink:

Seriously, why would anyone want to buy an AMD R9 290x, which is 2+ years old, consumes 2x as much power as a GTX 970, and uses lousy AMD drivers? On that last fact alone (lousy AMD drivers), the choice should be obvious.

Not to mention that UE4 will soon have VXGI, and that won’t run on AMD, nor on older GTX series either.

The R9 290x is not even 2 years old; it’s only a little over a year old. AMD has good drivers, and their GCN cards are holding up much better over time than Nvidia cards, especially in new games where Nvidia drivers seem to only be optimized for the 9xx series. The 6xx and 7xx have lost a lot of performance in new games because of this.

Sure, Nvidia has its own branches where you can download their Nvidia-specific stuff, but Epic won’t be adding it to base UE4 as it’s not cross-platform, and anyone looking to release a game to the general public isn’t going to make any Nvidia-specific stuff key to their game, as it just makes them less money.

I actually find it interesting, since UE4 is about cross-platform support and giving lots of access to developers, whereas Nvidia always wants to lock their stuff down: even if something could work on the competition’s hardware they will lock it out, and they will intentionally lower performance for everyone if it hurts the competition worse.

You are right that the R9 290x is only a year and change old. Nevertheless, it’s still a loud and power-hungry hog, regardless of its current price drop in the wake of the GTX 970.

But I believe you got the other stuff wrong…

NVIDIA GameWorks, as far as I could tell from the thread here on the forum, is cross-platform. The issue is that AMD does not want to use NVIDIA’s implementations/APIs because they are competitors. AMD is really hurting their users that way, yet their users keep blaming the other vendor for it. Or I suppose they could blame capitalism for it, the competition part. :slight_smile:

AMD never had good drivers, not before, not now. Heck, I used to know some engineers when ATI was still up here in Markham, and they seemed like smart guys. The hardware was decent, yet the software could somehow never catch up. Sure, everyone’s experience varies, but mine involved entire labs of workstations using ATI cards, so it all got taxed pretty well.

Epic is a partner with NVIDIA and it’s likely we’ll see GameWorks integrated into the engine soon. Whether AMD users like that or not, well, they can go complain to AMD about it, not NVIDIA or Epic, as I mentioned above. :slight_smile:

The R9 290x is significantly weaker than the 970 in terms of memory clock speed. The R9 290x simply can’t come close to the 970 because of this. Rather, in benchmarks it compares much more closely to the GTX 960, a significantly lower card with only 2GB of VRAM and a hundred dollars less than the R9 290x.

There is really absolutely no reason to buy modern high-end AMD cards. They’ve ******** up their recent hardware so heavily that it isn’t even a comparison between the companies at this point.

Where the heck do you guys come up with this painfully incorrect information? It’s the age of the internet; it’s easy to look up and correct. The 290x has much more memory bandwidth than the 970. The 970 has 196 GB/s + 28 GB/s while the 290 and 290x both have 320 GB/s. In games the 290x and 970 have similar performance, the 290x winning some and the 970 winning some.

The 960 doesn’t come close to the 290x, as reviews show; the 960 is a terrible card for the price unless all you care about is power consumption. It’s barely faster than the 760. You can get the 280 for around $150, which has similar performance to the 960, and the 280 has 3GB of RAM so it gives much better performance after the 960 runs out of its 2GB of VRAM.

You might wish to invest in reading glasses prior to attempting to say my information is inaccurate. It would do you well, since then you won’t be arguing against the wrong thing. I did not, after all, mention the memory bandwidth. I said the memory clock speed, which is 1,753 MHz in the GTX 960 compared with 1,250 MHz in the R9 290x; the core clock speed also comes out ahead at 1,127 MHz compared with 1,000 MHz.

There are some advantages to the R9 over the 960, but when you get down to all the PassMark scores and frame rate tests, the R9 290x only has a slight advantage over the 960, not at all worth the significant price difference in my opinion. And most certainly the 290x is nowhere even close to competing with the 970.

Once again most of what you said here is just plainly false.

The reference cooler on the 290x is loud; thankfully most cards you would buy today have custom coolers, which both keep the card cool and are quiet. Not just quiet, but quieter than the 970. The power difference is also overblown.

AMD had drivers that weren’t good in the past, but that is long gone and today they have great drivers. Heck, recently we have seen some problems and bad drivers from Nvidia, such as bad color over HDMI, or lack of optimization for the 7xx series after the 9xx series came out. Also, dual-card quality is better for AMD on the newest cards than Nvidia SLI.

I would love to see Nvidia change its tune on GameWorks (and other closed software) and open it up, but that would be a major change for Nvidia. Nvidia has in the past actively refused updates from AMD to make their code run better on AMD hardware. Plus we have seen Nvidia make performance worse on both their own and AMD hardware if it hurts the AMD hardware more.

If you want to see GameWorks in action, you should check out all the releases of GameWorks titles and how they run on PC: basically terrible for anything but the 9xx series on release, then AMD getting a performance bump after a driver update for that game.

Epic has stated multiple times that they won’t be adding GameWorks to UE4, in order to keep UE4 globally compatible.

Memory clock speed on its own is meaningless; what matters is bandwidth. The 960 is terrible, and a joke compared to a 290x. 1753 MHz at 128-bit vs 1250 MHz at 512-bit: the 960’s memory bandwidth is a joke, and the 290x has nearly THREE TIMES the bandwidth of the 960. Heck, the 3xx series will have an even slower memory clock speed yet MUCH higher bandwidth and lower latency.
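For anyone who wants to check that bandwidth arithmetic, here is a minimal sketch (it assumes GDDR5 on both cards, which transfers four bits per pin per cycle of the quoted memory clock):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """Peak bandwidth in GB/s: GDDR5 moves 4 bits per pin per clock, 8 bits per byte."""
    effective_mts = mem_clock_mhz * 4               # effective transfer rate in MT/s
    return effective_mts * bus_width_bits / 8 / 1000

gtx_960 = gddr5_bandwidth_gbs(1753, 128)            # ~112 GB/s
r9_290x = gddr5_bandwidth_gbs(1250, 512)            # ~320 GB/s
print(f"GTX 960: {gtx_960:.0f} GB/s  R9 290x: {r9_290x:.0f} GB/s  ratio: {r9_290x / gtx_960:.2f}x")
```

That gives roughly 112 GB/s for the 960 and 320 GB/s for the 290x, which is where the “nearly three times” figure comes from.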

If you want to learn about computer hardware, there are lots of places you can go, but the fact that all you are doing is simply stating clock speeds shows you don’t have knowledge of the subject.

http://www.guru3d.com/articles_pages/gigabyte_geforce_gtx_960_windforce_2x_review,16.html

There is not a single game in there where the 960 comes remotely close to the 290 or 290x, while the 290x and 970 trade blows, both winning some.

or look here

If they had used a custom-cooled card, the 290x would do even better.

You are an AMD user, right? That’s the only conclusion I can draw from our conversation so far. :slight_smile:

Epic might have stated before that GameWorks won’t be added to UE4, but that’s not what their users want, and let’s not forget who the real boss is here (remember the capitalism part about competition, profits and sales).

You seem to think that 2015 is going to stick around for 3-4 years or something? NVIDIA’s GM2xx came out last year and it’s probably the first in a series of chipsets which will do better and better with GameWorks, among other things. Progress waits for nobody, and someone who still has a GTX 580 in 2016 will be left behind (they already are, and so are GTX 6xx owners).

Technology improves dramatically, but some people think that vendors should stay behind and support their aging/aged products. That’s not how capitalism works; case in point: Apple, who forces their users to perpetually upgrade their hardware and software every 8-12 months. NVIDIA and AMD are no different; they have boards of directors and shareholders who demand ever-increasing profits, and companies leverage every (sometimes dirty) trick in the book to achieve those.

Coming back to UE4, we are not going to see many titles built with it for another year (or more), but when they do come out they will definitely not run well on a GTX 580 or 680, or an AMD R9 290x, because those cards will be 3+ years old and no one makes future games for old tech. It does not serve the game well, nor the engine tech, and it doesn’t help the longevity of the game either. Everyone wants their games to last for years ahead, and they need to compete with other games that are equally good and amazing.

Anyway, I hope I got my point across without going too far off the thread subject. We can have another thread where we can debate the economics of the hardware market and the tricks vendors pull. :slight_smile:

Epic doesn’t have to do anything with GameWorks; NV has their own branches, so if you want to use it you can.

I currently have an AMD graphics card, have used Nvidia in the past, and next gen will most likely pick up both an AMD card and a used Nvidia card for testing purposes. I am just as critical of AMD as I am of Nvidia. What I don’t like to see is incorrect information continuing to be spread, such as that the 290x is loud, when that is purely down to the cooler and there are lots of quiet 290s. Or that AMD has bad drivers, which is plainly not the case.

The R9 280 and 280x are both basically 3 years old, as they are just a rebranded 7950 and 7970.

You don’t want your game to only run well on the newest tech, as that seriously hurts your market; you have scalability so your game can run well on a wide variety of hardware without making a major change to the actual game itself.

Edit: Everyone knows WHY these companies do these types of things; it doesn’t mean we have to support it. They aren’t our friends, they are just trying to make money off of us. We should be supporting what we WANT them to be doing rather than just what makes them more profit.

Sorry if I’m late to the party on this one: I have a 980 at home and a 970 at work. The difference is, there is no difference; everything runs almost identically. Any minor differences can probably be traced back to the rest of the hardware being different.

One way or the other, the 900 series cards are utter monsters, and it’s seemingly impossible to get either of them hot too! You’re pretty much clutching at straws when it comes to comparing them; they’ll both last as long as each other and give you the same amount of smiles!

I’d also like to point out that Windows 10 will come with full DirectX 12 support. Also, Unreal Engine 4 is one of the engines that Microsoft is using to test DirectX 12. Nvidia and Microsoft are also advertising that Maxwell cards are the first to have full DirectX 12 support.