
Best video card for Unreal Engine 4 development?

Hello guys, I just started using Unreal Engine 4 and I have some questions about which video card to use. Right now I have an AMD Radeon HD 7700.

So I would like to know which brand is better for developing in Unreal Engine 4: NVIDIA or ATI/AMD?

I would also like to know what the important things are to look for in a video card when buying one for Unreal Engine 4 development.

I’m thinking of buying a GeForce GTX 770. Is this a good card for Unreal Engine 4 development?

Thanks in advance,
Jason.

Both brands are supported, and there are no specific features that work better on one or the other. I prefer Nvidia myself, but the latest AMD cards offer pretty good performance.
On the computers I’m currently on, I’m using a GTX 560, a GTX 660, and a GTX 770, and they all work great.

I would recommend taking a look at this thread: https://forums.unrealengine.com/showthread.php?20643-Official-Hardware-Performance-Survey. There you can find many PCs with different specs and GPUs.

I personally use the Nvidia GTX 600 Ti and get a pretty good framerate with it :slight_smile:

Both AMD and Intel can do the job, but speaking as a budget AMD user: if you have the money for an Intel CPU/mobo, get it. Your GPU is fine. If your CPU is an AMD FX™-4100 Quad 3.60GHz, though, I suggest replacing it or you’ll be facing lag issues.

AMD FX™-4100 Quad 3.60GHz - CPU
AMD Radeon HD 7970 - GPU
8GB - RAM
1TB - HDD
Windows 8

Hmmm

In my opinion, video cards are reaching the point where fast is fast enough, and it’s going to get harder for card manufacturers to create a need for an upgrade that doesn’t involve adding value.

I just bought a 780 Ti because I was in the market for a video capture card, and the Ti has one built in, called ShadowPlay.

As for performance in the sense of moving polygons, that’s a problem that was solved a long time ago; it’s now more about things like textures, lighting, and fill rates (draw calls). So yes, a 770 will do the job, as those are things under the control of the developer rather than a limitation of the hardware (and you get ShadowPlay to boot).

https://www.youtube.com/watch?v=IQ_BWblteQU&list=UU547Klqrd3y4AUij2Au4LTw

To prove the point, here is a test I did way back (note the date).

At the time I was using an 8800 GTX 512 on a six-year-old CPU.

Well, I prefer Nvidia’s cards. Also, remember that Nvidia has all those GameWorks technologies, which may get third-party Unreal Engine 4 implementations, and with those an Nvidia card will get you better performance.

With this being my very first gaming GPU (before that I was just running my 2600k and nothing else, lol, I was lame), I chose an XFX Radeon HD 7850 2GB OC, only to find out that the R9 270X came out a couple of weeks later -_-. But I love my 7850 and it’s holding up great. :smiley: My next card will probably be from NVIDIA so I can get a taste of both cakes. I have no problem maxing out games. I can play Battlefield 4 maxed at 1080p with no AA at 60+ FPS on certain maps, and 50-60 in DirectX 11 on others. I haven’t started using Mantle again because I was noticing a couple of weird artifacts, but that was a couple of months ago. Hopefully it’s enough to run (if not max out) Dead Rising 3, Dying Light, and Metal Gear Solid 5, and to at least get a playable framerate, if not 60 FPS, in The Witcher 3 on some tweaked settings :smiley:

Epic has stated they are GPU-agnostic and will only support tech that works on both vendors’ cards (e.g., in Unreal, PhysX runs on the CPU, even though it could be done far quicker on an Nvidia GPU). This prevents the GPU manufacturers from bringing out proprietary tech and trying to screw over the people who don’t have their cards.

In other words, there is no better GPU manufacturer for Unreal. Just get the best you can for your budget. :slight_smile:

What would be cool is some GPU support just for use of the editor, so that we can have better speeds while creating; I don’t mean for the final product. For example, I’m thinking of getting a 780 Ti, but I’d like to know that it’s worth it.

Well, still, there probably will be third-party implementations, and the truth is, Nvidia has made much more technology that gets accelerated by their cards. Epic will keep all of its own implementations manufacturer-agnostic, but people will still make implementations of Nvidia tech.

It is worth it. At the moment it’s the best GPU (except for the Titan Z).

I think how powerful a graphics card you need depends on your expected usage of UE4. In my case, I am a beginner still learning so I can get by with my GTX 650 Ti Boost no problem. Even in higher end stuff - like tech demos - I still have no problems viewing those through the editor with the graphics card I have. So going with the recommended GPU from the UE4 page would be fine for most people, unless you are planning to do something really big that requires a lot more GPU power for testing and developing in the editor.
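If you want to check whether the editor is actually GPU-bound or CPU-bound on your machine before spending money, UE4 ships with stat commands you can type into the in-editor console (these are standard engine console commands, not tied to any card vendor):

```
stat fps    -- show the current framerate and frame time
stat unit   -- break the frame into Game (CPU), Draw (render thread), and GPU time
```

If the GPU figure dominates the frame time, a faster video card will help; if Game or Draw dominates, you’re CPU-limited and a GPU upgrade won’t do much.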

I develop on a few PCs and two MacBook Pros. My PC has a 4790K with a GTX 980 G1 (4GB VRAM), and my MacBook Pro 15 has the built-in Intel Iris Pro graphics (1.5GB VRAM) and a 2.8GHz i7. The PC is quicker, with smoother FPS, especially in the in-game preview window. On the PC I can get 60-90 FPS, while the Mac stays at 40 FPS, with a scene of about 386,000-400,000 tris. Also, with 2-3 Blueprints open, the MBP drops to 15-20 FPS, while the PC with the GTX 980 maintains over 60! The MacBook is still very usable, albeit with lower performance when it goes full screen (Retina is 2560x1600) or multiple Blueprints are open. Since I develop for mobile, I keep the viewport relatively small.

On a side note, I have an AMD machine with a 7850K APU and an R7 240 in CrossFire. The fully built app (not in the editor) runs at barely 25-30 FPS at 720p. I also have an i3-4160T running a GTX 960 with 4GB RAM; it runs 1080p at a solid 30-40 FPS in a standalone build. It varies completely from machine to machine, and also between the in-editor preview/selected viewport and external app preview windows. Since I’m using Mobile HDR, and the apps aren’t intended to be run at this high a resolution, these tests are sort of moot.

Worst scenario I have is my MacBook Pro 13" non-Retina (2012) with an i5 and built-in Intel HD 4000 graphics. It’s almost unusable, even with a tiny viewport preview, let alone full screen.

Just wanted to put my experiences out there, though it may feel like incessant rambling :slight_smile:

These were all done using Unreal 4.12.5. Bottom line: it all depends on what you’re developing for. HDR/LDR mobile, resolution, etc. are all relevant factors. I used the Intel Iris Pro 1536 on my Retina MBP, which runs fine in most scenarios for mobile development. If I were coding a complex 1080p full-screen game with full particle effects, bloom, post-processing, etc., I wouldn’t go for anything less than a GTX 960, 970, or 980, or the AMD equivalent. Also, more than 2-4GB of VRAM if your project is AAA, for consoles or a standalone Windows or Mac build.

I use a Radeon R9 380 (4GB). It won’t get you 60 FPS on a heavy project, but it’s a very reasonable budget card. My game is very graphics-heavy and runs at 30 FPS average on it, probably because that’s what I’m targeting. But if you want something beefy, then I suggest you go for a GTX 980, or maybe… a GTX Titan??
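Since posters in this thread keep quoting 30, 60, and 90 FPS targets, it helps to think in frame-time budgets, which is what profilers actually report. A quick sketch of the arithmetic in plain Python, nothing engine-specific:

```python
# Convert a target framerate into the per-frame time budget in milliseconds.
# Everything in a frame (game logic, draw calls, GPU work) must finish
# within this budget or the framerate drops below the target.
def frame_budget_ms(target_fps: float) -> float:
    return 1000.0 / target_fps

for fps in (30, 60, 90):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.1f} ms per frame")
# 30 FPS -> 33.3 ms per frame
# 60 FPS -> 16.7 ms per frame
# 90 FPS -> 11.1 ms per frame
```

This is also why hitting 60 FPS is much harder than 30: the budget for the whole frame is cut in half, not just the GPU’s share of it.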