What GPUs do you use for game development with UE4?

Please explain “speed” versus “power”: isn’t a faster card also a more powerful one, and vice versa?
What makes one better in gaming and the other better at modeling?

I have run several GTX cards over the past few years. My impression was that modeling and CAD (AutoCAD, DraftSight, Blender modeling) were so easy on all of them that there was almost no difference between them. Rendering (Blender Cycles) was a different story: there, time was counted in hours and minutes, not screen lag, and the cards performed proportionally to their benchmarks on sites like PassMark’s GPU tests, in line with Nvidia’s naming scheme across generations and CUDA power.

There are certain situations where one architecture gives more of an advantage: gaming cards are better at actually rendering things, which is more about speed, whereas workstation cards are better at managing large numbers of objects, which is more about power.

I’m using the latest Intel HD 3000; it’s so fast my eyes can’t even keep up when I’m playing the demos.

As others have said, stay away from workstation cards or any non-gaming cards; they are built for specific software and other purposes. In gaming, for example, you don’t need to overload your models with the tons of rigs and polygons you would use in real-life simulations for machinery, engineering, etc.; in gaming it’s up to you.

Well, the last video card I got excited about was the GeForce 3. I bought it on the promise that it would make my whites whiter and my brights brighter, but was disappointed in what it did for me that day as opposed to what it might do for me five years down the road.

Granted, newer cards mean faster, but how fast is fast enough?

For now my 780 Ti gets the job done, as I bought it for ShadowPlay, but one day I might buy a “used” 1080 on impulse, because, well, it is better. :wink:

In the meantime,

I would rather see an effort made to standardize the technology so that we can all make use of hardware-rendered physics in Unreal 4 instead of CPU-rendered.

The problem with selection is that it’s VHS versus Beta all over again, and to be called a “true” video game card, a card should share usable features today without having to be a pioneer.

Unless your name happens to be Carmack. :wink:

Most people take Quadro’s sky-high price seriously but not the driver. In fact, the hardware is almost identical; what really matters is the driver and the stability that comes with it.

In my experience Quadros are far more stable than consumer cards, especially on workstations under very intensive loads. For example, in the IDC industry, there’s a massive budget difference between 99%, 99.9%, and 99.99% stability.

At the end of the day it still depends on your risk assessment, i.e. disruption. If I’m working on a DIY project at home, I wouldn’t mind whether it’s a Quadro, a GTX, or Intel HD. But if there’s a deadline tomorrow for a $5M project, I wouldn’t put myself in danger of having the driver unload frequently.

It’s a trade-off: you are not purely buying performance. Simple fact: if you don’t see the benefit of using a Quadro, just ignore it. Unfortunately for those of us in the AEC industry, we do need to pay the sky-high price, and there’s no alternative.

As per AMD… lol

Okay, people say the GTX 1080 and 1080 Ti are very powerful when it comes to gaming and development as well; I am not knocking those cards. Nvidia is going to release newer versions of the Quadro series called the Quadro RTX 5000, RTX 6000, and RTX 8000. The Quadro RTX 5000 through 8000 are designed for game developers; two RTX 2080 Tis are equivalent to the price of one Quadro RTX 5000, which can handle gaming as well. I’m wondering if these kinds of GPUs are worth the money. Also, if you bridge two Quadro RTX 5000s together with NVLink, they double up as one GPU without one being the master and the other being the slave, as GTX GPUs do; the Quadro RTX 5000 through 8000 use NVLink so they can communicate with each other directly, with no slave GPU anymore. I’ve thought about using a Quadro RTX 5000 for UE4, but they’re not released yet.

I was watching someone on YouTube using a Quadro GV100 GPU to do the hair on a wolf; he got up to a hair count of well over 10 million.

The RTX 5000 / 8000 are not suitable for game development; they are designed with CAD, high-resolution assets, and non-real-time applications in mind. For general game development, there is no real advantage over the 2080 Ti (and depending on firmware, the 2080 Ti may well perform better for real-time applications).

Quadros can outperform GeForce cards, but at a much, much higher cost. If you have a budget and are comparing a GeForce card and a Quadro card at around the same price, the GeForce card will be better.

As far as NVLink goes, that is something that has to be implemented within the software you’re using, so UE4 would have to support that feature.

I just stick to the mainstream Nvidia consumer/gaming cards made by ASUS. As much fun as trying to break the mould is, sticking with the common framework far outweighs any benefit you’ll get from other cards (in my experience).

I would rather risk spending an extra $200-$300 than spend months (or years) fighting with a framework that’s not intended for gaming. I’ve never had a single issue with the ASUS Nvidia cards and will probably use them until I’m old and rickety. I’ve still got an old ASUS GeForce 3 from 2003 kicking around that works.

It really depends. With the given example of the RTX 5000 and the 2080 Ti, the two have very similar paper specs (the Quadro is slightly better and has 16GB of memory vs. the 2080 Ti’s 11GB). I suspect game performance between the two is likely at parity, but the RTX 5000 is going to cost you £2200 to the 2080 Ti’s £1100.

That’s exactly what I said: at the same price point the GeForce cards are better, but the Quadro cards can offer more power/memory, at a much higher cost.

Well, as a “game developer”, the end users’ requirements should be taken into consideration, and anything above that is just to make life easier.

https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

Based on Valve’s numbers, Nvidia owns just over three quarters of the video-game-use market, with DirectX 12 cards trending highest over the past four months and the 1060, 1050 Ti, 1050, and 750 Ti showing positive trends over the same period.

As for Quadro cards, they don’t even break into the survey list, and if the reviews are to be believed, they are better suited to specialized needs.

For my personal use I went with the 1050 Ti as a balance of price versus performance, and I have yet to feel I need more; raw FPS performance really does not matter as much in 2018. My next upgrade will only come once the promise of true baked-Lightmass ray tracing and bucket-fill rendering becomes a reality, or when V-Ray comes stock in UE4. :smiley:

For development, get the best you can afford. For testing, though, I think it’s useful to have a machine with your targeted minimum specifications, so that you can check performance during development and get it working well.

Hello everyone, I found a YouTube video about the Nvidia Quadro RTX 6000, so I’ll give a share code. While watching, when the guy shows a Porsche, look at the type of GPU and the game development company. Here is the code. The car visuals look really photorealistic.

It looks great; however, A) it’s not really a game application, and B) you can see the frame rate is pretty poor. Quadro cards are for accuracy; GeForce cards are for speed. You want speed for games.