Facing a dilemma (how important is the GPU?)

Hello, as you've probably noticed, GPU prices are going mad right now. So I'm considering upgrading my GPU from a 2060 to a 3070 before prices get completely out of hand. The thing is, right now, at the beginning of my game (an open world that I've only populated a little), I have no need for a GPU upgrade. My CPU handles everything flawlessly, and the little 2060 keeps up nicely, whether it's loading the editor, compiling shaders, or loading the level I'm working on. So my dilemma is simple: should I upgrade the GPU now, or can I do without it? I'm pretty sure the GPU market will stay like this until at least 2023, and will very likely get worse, which is why I'm considering the upgrade. This is my first game, so there are still grey areas for me. At this point my config runs very well in UE4 and Blender; the only slight lags I've had were with iClone, but nothing really bothersome. The config is a 2060 + 3900X + 32 GB of high-end RAM.

1 Like

Unless you find a modern GPU at RRP, I say hold off on splashing out. The 2060 is still no slouch. Also, the target audience of your game will not be equipped with cutting-edge devices. It will be easier for you to playtest and identify bottlenecks / performance hogs at a glance without laborious profiling.

https://store.steampowered.com/hwsurvey

Hardly anyone owns a 3060 or up and, as you'd expect, this is unlikely to change much in the next year or so. While you can buy Ryzen CPUs right now (at least in Europe), the scalper-free GPU market is non-existent.

3 Likes

That's an interesting link. I thought that with the lockdowns the NVIDIA 3000 series would be more widespread by now. So according to this, the current shortage is mainly caused by miners?

The market was supposed to be flooded with used GPUs after China yet again made crypto operations even more illegal. Somehow that did not happen. Someone else, someplace else, picked up the slack?


I’ve heard rumours about madmen actually going to brick and mortar shops, buying units off the shelf, like in the 90s. Truly desperate times we live in.

2 Likes

If you’re planning on releasing a game for consumers, consider them too. If you have to upgrade your machine for your game to get good performance, that might mean the minimum requirements to run your game are too high.

2 Likes

I forgot to say that I also own a mid-range test PC: a 3600X + 1070 + 16 GB. But yes, your argument totally stands, darthviper07. There's no point in having a graphics card that lets you develop fast if you have to constantly go back and rework what you've done. This is something I've had in mind since the beginning. Thanks a lot for the help; the link was very useful.

That isn’t exactly true; a debug build is a lot slower than a cooked release, so improving your specs far above the target specs is very helpful for cutting down development wait times. You should optimize the release build to work on lower end hardware than what you used to make it, but developing on the same hardware you expect the player to use is going to be painfully slow.
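For reference, when it comes time to compare against the lower-spec target machine, you'd measure a cooked Shipping build rather than a Play-In-Editor session. A rough sketch of the usual UAT command line (the project path, output folder, and exact flag set here are illustrative; adjust them to your project):

```
REM Illustrative only: project path and output folder are placeholders.
REM Run from Engine\Build\BatchFiles of your engine install.
RunUAT.bat BuildCookRun -project="C:\Projects\MyGame\MyGame.uproject" -noP4 ^
  -platform=Win64 -clientconfig=Shipping ^
  -build -cook -stage -pak -archive -archivedirectory="C:\Builds"
```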

Basically, if you start development on X hardware, then three years later at release you can set your ultra-spec recommendation to that hardware, since it will be more widely available to the public by then, and you already know from your own experience running the game unoptimized that it will handle the highest settings you've included.

That being said, you can of course still make a game on hardware that couldn't run it smoothly even if it were fully optimized, but it will involve a lot more taking naps and watching YouTube videos while code builds and shaders compile, and then running previews with lighting turned off, etc.

Don’t mistake me; I’m not saying a game should require or even be designed for high end systems or have advanced graphics at all. I’m just pointing out that development requires a lot more computation than merely playing the game does, no matter what type of game you are making.

1 Like

Having a powerful machine for development is great, but what I was saying is that if that’s the only machine you’re working with and you have to upgrade it for your game to work then that may be a sign that the minimum requirements of your game are too high.
He said he has a lower-spec machine for testing though so he should be good.

1 Like

Well, at this point I have no problem. Avoiding issues is pretty simple: just test on the lower-spec machine often, and if there's a problem you should be able to locate it fast. About the shaders, that's not a GPU thing; my 3900X is a shader-compiling ace. As I already said in another post, I can compile 3,000 or 4,000 shaders in less than a minute (or so). Regarding the lighting, my map is too big for Lightmass, so I have to go fully dynamic. High-end rigs certainly have their defenders, but personally I prefer a mid-range rig; it's a choice I've made. GPU prices are really insane right now. If I go for a 3070, it's about €1,000, no less, and frankly I don't think that's justified.
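For anyone wondering what "fully dynamic" means in project terms, here's roughly the kind of thing involved, assuming a stock UE4 open-world setup (exact values depend on the project, so treat this as a sketch, not a drop-in config):

```
; DefaultEngine.ini (illustrative sketch, not a drop-in config)
[/Script/Engine.RendererSettings]
; Disable baked lighting entirely so Lightmass never has to run on the big map
r.AllowStaticLighting=False
; Mesh distance fields help dynamic shadowing hold up at open-world distances
r.GenerateMeshDistanceFields=True
```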

1 Like

It’s hard to make predictions, especially about the future.

I would expect that not only will there be a GeForce 4000 series in 2023, but its cards equivalent to today's 3000 series will sell for fewer real dollars then than the 3000 series does now. This is because development is ongoing, AND because the world factors contributing to the “chip crunch” are being addressed, with multiple companies building more chip fabs, some of which open in 2022/2023. (They've of course been under construction for a long time!)

That being said, developing on a 2060 means that you won't accidentally build a game that won't run on anything below a 3090. Even a 2060 is pretty fast compared to Intel integrated graphics, so you don't necessarily have to upgrade.
If you don't already have a very fast M.2 SSD in your computer, that's probably a much more worthwhile upgrade, for example.

1 Like

Jensen Huang is pretty certain the situation will remain grim through 2022/2023. But then… how did their clairvoyance not predict the shortages in the first place, eh? One can't possibly blame everything on the Covid + crypto combo.

There are already announcements of series 4 coming out in the second or third quarter of next year.

In all likelihood the new series will also prevent mining.

Sadly, the prices are unlikely to drop.
The market is devoid of products even at outrageous prices.
And this has shown that people are willing to pay insane prices for essentially nothing (a 3060 at around 1k? Are you f*ing kidding me?).

It's not so much about the “shortage”, which does exist across everything silicon…

If you want a 3090 badly enough, you can buy a pre-built for around $4k and get one.
The markup on it is still around 20%.
The parts you don't need you probably won't be able to sell off either, since those builds all come with crap memory and mobos.

On the other hand, if you want a G-Sync Ultimate 4K 144 Hz HDR monitor, you are just SOL.
With only a few models released, all about a year old (the Predator X27/28/32 etc., and rare ASUS ones), the current market has used ones going for $2k.
If you thought the GPU market was tough…

I suggest everyone try to get in line with EVGA when possible. Look it up. You can't enter the queues right now, but hopefully after the next batch it will be allowed again.
Wait a year, get your GPU at MSRP…

1 Like

I already have a good SATA SSD. Not sure an M.2 would improve anything?

DirectStorage is the name of the game. Have a look at this thread:

And the links at the very bottom if you want to dive a bit deeper into this.

“Hello, as you've probably noticed, GPU prices are going mad right now.” Yeah, that is ridiculous; $1,000+ for modest cards.

You mention that what you have can actually run what you've made in UE so far, so I think you should be alright. This is perhaps one of those things like firmware upgrades: is there really a reason to upgrade? Is it a functionality thing, or just more FPS or resolution offered by something else?

Does your name say fergi or Ferengi?
That statement is the equivalent of quoting the Rules of Acquisition…

SATA 3 has a max bus speed of 600 MB/s.
SATA 2 has a max of 300 MB/s.
NVMe does 2,000 MB per second.

So… still think the right M.2 wouldn't improve performance?
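To put rough numbers on it (back-of-the-envelope, ignoring seek time, decompression, and CPU-side work): streaming 3 GB of level data at SATA 3's real-world ceiling of roughly 550 MB/s takes about 3072 / 550 ≈ 5.6 s, while the same 3 GB off a mid-range NVMe at ~3,500 MB/s takes about 3072 / 3500 ≈ 0.9 s. Whether you actually feel that difference in the editor depends on how much of the wait is disk-bound rather than shader- or CPU-bound.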

You gotta bump those (2018) numbers; those are rookie numbers. Sustained reads of 7 GB/s and writes around 5 GB/s are what define new drives.

That's an order of magnitude faster than SATA 3. Also, with DirectStorage, the GPU gets involved.

I was being a bit of a Ferengi myself.

I can't really get Unreal to write to render targets that fast anyway, so the benefit for development or for a final exe seems somewhat marginal once you hit the compute speed limit.

That said, upgrading other hardware later has the potential to improve on that without needing to upgrade the SSD yet again.
So instead of being a Ferengi, if you spend $200 on a 2 TB M.2 with good numbers/ratings/reviews, you'll probably keep using it for a few years.

Many NVMe drives require drivers from the manufacturer to work right.
Intel ones are particularly nagging, to the point that you BSOD randomly without the drivers installed.