Multiple Graphics Adapters (related to 4.1 QFE / very low fps rendering)

Some machines, most commonly laptops, contain multiple graphics adapters. Selecting the correct adapter can be more complicated than you might think. When we released the 4.1 update, some users noticed a severe decrease in performance. It turned out we were selecting the wrong adapter in some situations on Windows 8. Read on if you want the technical details and a workaround; if you just want it fixed, download the QFE.

When we released Unreal Engine 4 to everyone willing to spend $19, we knew we would quickly get a lot of testing on all kinds of hardware. Some users reported bad performance, and we found multiple causes. Here we want to focus on Unreal Engine specifics, so we set aside cases where the user has a slow graphics card and/or a slow CPU, low memory, or other hardware issues such as running on battery. We would also like to skip general performance comparisons with other engines, shipped games, or other techniques. We will continue to optimize the engine and adapt to the wide range of hardware out there. Many design decisions we make today cost more performance but offer other benefits such as better quality or a lower performance cost when used extensively (e.g. deferred shading is slower in the single-light case, but with complex geometry and many lights it’s much more efficient).

NVIDIA Optimus
We found that in some cases users had multiple graphics cards and the system picked the wrong one. The engine is optimized for high-end visuals, so the right one would be the fastest one. Many laptops have multiple graphics cards, and by default an application gets the more energy-efficient but slower one. In some cases this choice can be adjusted in the Windows graphics card settings. We had already applied a hint (NVIDIA Optimus specific) telling the system that our application should be considered high performance and that it should pick the faster card. This hint wasn’t fully working, and that was fixed in Unreal Engine 4.1. If the user explicitly overrides this behavior in the driver, we cannot do anything about it; in that case the user wants to save battery or energy and has to accept a slower frame rate. The hint needs to be put into each executable, so it’s possible a different engine executable doesn’t show the same behavior. We also have several ways of compiling the engine, and all of that can affect the result. Note that Optimus seems to hide the second adapter. In cases where we see multiple adapters we have to apply a different solution (see the next paragraph).
More information on Optimus: NVIDIA Optimus Programming Guide | NVIDIA Developer
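For reference, the Optimus hint mentioned above is an exported global variable in the application executable. A minimal sketch of such an export is below; the symbol name and value come from the Optimus Programming Guide linked above, and this is illustrative only rather than necessarily how the engine itself implements it:

// Exporting this symbol from the .exe asks Optimus to run the application
// on the high-performance GPU. It has to live in the executable itself
// (not a DLL), which is one reason a different engine executable can
// behave differently.
extern "C"
{
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}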

Multiple graphics card adapters
Not every machine with multiple graphics cards uses the NVIDIA Optimus system. In that case we would like to pick the faster card. How do we pick the faster card? We could run a benchmark, but a complex system (that could fail) would need to be developed, the user would have to wait, we would have to store the result to avoid waiting the next time, and we would have to detect hardware changes to run it again. Simpler is to implement a good heuristic: detect the integrated (on-board) graphics card and prefer a second dedicated (slotted) graphics card, as those are usually faster. For Unreal Engine 4.1 we implemented that heuristic and it worked fine on the machines we tested. After release some users reported severe performance loss, and some of those cases were caused by this heuristic: if the user had multiple adapters it avoided Intel and picked the next one. It turns out that on Windows 8 there is a software rendering adapter that is enumerated after the Intel hardware one. That one is far slower and therefore not a good choice.
We improved the heuristic, and the changes are currently being tested or might have been released already. The software device can be explicitly detected through a specific vendor ID, and it has no display outputs. We also improved the heuristic to prefer an NVIDIA or AMD graphics card.
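To make the improved heuristic concrete, here is a rough sketch of what such an adapter-selection pass can look like using plain DXGI. This is illustrative only and not the engine’s actual code; the vendor IDs are the standard PCI vendor IDs (0x8086 Intel, 0x10DE NVIDIA, 0x1002 AMD, 0x1414 Microsoft, used by the software adapter):

// Sketch: enumerate D3D11-capable adapters and prefer a discrete NVIDIA/AMD
// card, skipping the Microsoft Basic Render Driver (software) adapter.
#include <dxgi.h>

int ChooseAdapterIndex()
{
    IDXGIFactory1* Factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&Factory)))
        return 0; // fall back to the default adapter

    int Chosen = -1;
    int FirstHardware = -1;
    IDXGIAdapter1* Adapter = nullptr;
    for (UINT Index = 0; Factory->EnumAdapters1(Index, &Adapter) != DXGI_ERROR_NOT_FOUND; ++Index)
    {
        DXGI_ADAPTER_DESC1 Desc;
        Adapter->GetDesc1(&Desc);
        Adapter->Release();

        // Skip the software adapter; it is far slower than any hardware one.
        if (Desc.VendorId == 0x1414 || (Desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE))
            continue;

        if (FirstHardware == -1)
            FirstHardware = (int)Index;

        // Prefer a discrete NVIDIA or AMD card over the integrated Intel one.
        if (Desc.VendorId == 0x10DE || Desc.VendorId == 0x1002)
        {
            Chosen = (int)Index;
            break;
        }
    }
    Factory->Release();
    return (Chosen != -1) ? Chosen : ((FirstHardware != -1) ? FirstHardware : 0);
}

As noted above, the real heuristic also takes display outputs into account, which matters on laptops where the discrete GPU often drives no display of its own.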

How to check if Unreal Engine runs on the wrong adapter
Looking at the latest log file (Saved/Logs) you can see if it picked the wrong adapter:
Found D3D11 adapter 0: Intel(R) HD Graphics Family
Adapter has 0MB of dedicated video memory, 0MB of dedicated system memory, and 1792MB of shared system memory
Found D3D11 adapter 1: Microsoft Basic Render Driver
Adapter has 0MB of dedicated video memory, 0MB of dedicated system memory, and 256MB of shared system memory
LogD3D11RHI: Chosen D3D11 Adapter Id = 0

How to override the adapter choice / workaround the wrong decision
For most low-level features in Unreal Engine we use console variables to give the user control. For this feature the console variable is called “r.GraphicsAdapter”. You can type the console variable name into the OutputLog or into the game console to get its current value, and get the help text using “r.GraphicsAdapter ?”. This is the console variable help:
User request to pick a specific graphics adapter
(e.g. when using an integrated graphics card with a discrete one)
At the moment this only works on Direct3D 11.
-2: Take the first one that fulfills the criteria
-1: Favour non-integrated because they are usually faster (default)
0: Adapter #0
1: Adapter #1, …

You cannot change this setting at runtime like most other console variables, but you can add a line to the file ConsoleVariables.ini:
r.GraphicsAdapter = -2
Note that to set a console variable there you have to add an “=”, and you must not add any characters in front of the line. A “;” you might see there comments the line out so that it has no effect.
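As a minimal illustration (the exact contents of your ConsoleVariables.ini will differ), the relevant part of the file could look like this; the first line is ignored because of the leading “;”, while the second line actually sets the variable:

; r.GraphicsAdapter = -1    (commented out, has no effect)
r.GraphicsAdapter = -2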
You can check whether your change took effect by looking at the log again. You can also type the console variable name into the OutputLog or the game console to get its current value.

How about giving the user a choice
Someone who can change an ini file already has the choice; however, it’s not easy to discover. The system could ask the user, but that can go wrong as well. We will continue to improve our system, maybe using a benchmark or asking the user, but we would also like to develop a good system that can make the decision for the less knowledgeable user. Sorry if this caused trouble, and thanks for letting us know so we could quickly fix the issue.


Hey, a question: I’m building a PC. I know many believe I shouldn’t, but… should I set up my PC’s power to match what I’m developing for? I’m making a fighting game for Xbox One. The power of the Xbox One roughly matches the AMD Radeon HD 7790 graphics card. Should I get that graphics card? Will Unreal run well on it, like the Elemental demo? I’ve seen the comparison video for PS4 and it wasn’t bad, but I don’t know what graphics card the comparison video was using. What do you all think?

Well, it’s not always about the graphics card. The one you are looking at is indeed a good one. I have a GeForce 640, which is pretty good for the quality I desire, but my computer lacks RAM and CPU power. I have 6GB of RAM and a measly 2-core Intel i3 CPU, and the CPU is what sets my PC back a bit. So you may get an awesome graphics card, but that doesn’t mean your PC can keep up with it. That’s what I have been told. Anyway, I hope I helped you out in some way.

No, build the best PC you can afford! Fair enough if you are just building a test machine with similar specs to an Xbox One (even that wouldn’t be a direct correlation, as the Xbox is running a slimmed-down Windows), but if you are actually using it to develop the game then there is no question: go overkill. Nobody develops games on Xbox One-level hardware for Xbox One-level hardware.
If you were creating an iOS game, would you build a computer with iOS specs? Hmm, I don’t think so.

Don’t know if this is the right place to post this, but I’m getting a laptop in a few days and was wondering if the specs are good enough to run UE4.
(I know a desktop would be better, but I need a laptop and don’t have the cash to buy both.)

CPU: Intel® Core™ i5-4200M Processor (2.5 GHz, 3.1 GHz with TurboBoost, 3 MB cache)
GPU: NVIDIA GeForce GTX 850M (2 GB GDDR3)
RAM: Kingston 12 GB DDR3 (1600 MHz)

Thanks in advance, and sorry if it’s in the wrong thread.

In my opinion the GTX 850M should be good enough for UE4; however, you will run into situations where it’s not. I haven’t tried that particular card, so let me talk about my experience:

CPU AMD 3-core 2.4GHz
4GB RAM (upgraded to 8 this week)
Radeon 6950, 2GB video RAM
SSD

My video card has comparable performance to a (desktop) GTX 750 Ti, which I suppose has performance similar to a (mobile) GTX 850M. I can run the medium-to-heavy Landscape Mountains project OK, but I would definitely prefer a somewhat higher frame rate than what I get. I can easily build relatively big levels as long as there are not too many complex shapes and shadows. For example, when I add a small forest made of regular bushes with many leaves, things get tough. Then I have to lower settings in order to get a frame rate I can work with (anti-aliasing, shadows, post processing, resolution, etc.). However, this card would perform better in combination with a more suitable CPU…

So yes, I think the GTX 850M is definitely an OK card for UE4, but if you can get a better one, that will only be good for you :)
I have also been considering getting a laptop with an 850M for times when I have to travel. Choosing a weaker card is simply because I hate overly powerful components in a laptop that make it run too hot. As far as I know, the 850M is 45W, the 860M is 75W, and the 870M is 100W (and the architecture of the 870M is less power-efficient). Depending on how well the laptop cooling works (how efficient and how noisy), you could consider the 860M as well. I personally wouldn’t go for a “beast” like an 870M in a laptop that needs to be mobile… I guess only a heavy laptop can comfortably handle such a card, and I am looking for something that I can carry by myself :)

Some extra tips:
My CPU is definitely a bit annoying when building lighting, baking, etc. I can see all three cores are used at 100% (which is good; good job with the multi-threading, UE!), but it takes too much time to build even a relatively small level with a little foliage and two lights: about 1-2 minutes. So I am considering an upgrade to a 4-core i5/i7.
4GB of RAM is DEFINITELY not enough. Really, a minimum of 8GB is mandatory in my experience. I even had cases where the build failed due to insufficient RAM. Having an SSD helps a lot if you have little RAM, but it cannot do wonders.

I hope I’ve helped!

This is not working for me.


[2015.02.20-20.51.57:097]  0]LogD3D11RHI: Found D3D11 adapter 0: Intel(R) HD Graphics 4600 (Feature Level 11_0)
[2015.02.20-20.51.57:097]  0]LogD3D11RHI: Adapter has 4015MB of dedicated video memory, 0MB of dedicated system memory, and 4037MB of shared system memory, 1 output[s]
[2015.02.20-20.51.57:111]  0]LogD3D11RHI: Found D3D11 adapter 1: NVIDIA GeForce GT 750M   (Feature Level 10_0)
[2015.02.20-20.51.57:111]  0]LogD3D11RHI: Adapter has 4015MB of dedicated video memory, 0MB of dedicated system memory, and 4037MB of shared system memory, 0 output[s]
[2015.02.20-20.51.57:112]  0]LogD3D11RHI: Found D3D11 adapter 2: Microsoft Basic Render Driver (Feature Level 11_0)
[2015.02.20-20.51.57:112]  0]LogD3D11RHI: Adapter has 4015MB of dedicated video memory, 0MB of dedicated system memory, and 4037MB of shared system memory, 0 output[s]
[2015.02.20-20.51.57:112]  0]LogD3D11RHI: Chosen D3D11 Adapter Id = 0

r.GraphicsAdapter = -2 Same result
r.GraphicsAdapter = -1 Same result
r.GraphicsAdapter = 0 Same result
r.GraphicsAdapter = 1 Gives me error > DX11 feature level 10.0 is required

Any ideas?

EDIT> Fixed by updating the drivers!

The same issue for me. Any news about it?

Edit

Yeah, I got it running. I have a laptop using NVIDIA Optimus. No matter what I set for r.GraphicsAdapter, Unreal always chose the Intel HD 4600. I noticed in the logs that it might be because there is in fact no output on the GeForce adapter; all output goes through the Intel one, so all displays are connected to it.

So I made a fake display connected to the GeForce according to this: http://www.helping-squad.com/fake-connect-a-monitor/

Now that there is an output on both adapters, UE finally chooses the GeForce!

Cheers

Is there any more recent solution for this problem?
I’m currently trying to run Unreal Editor 4.14.3 on my machine (Intel i7 6900K and EVGA GTX 980 Ti), but the Editor refuses to pick my GTX 980 Ti as the adapter and always chooses the integrated Intel HD530. As you can imagine, the results are poor to say the least, while very powerful hardware is readily available. Frustrating …

Below is the relevant excerpt from my most recent log.
I have tried editing the ConsoleVariables.ini file, adding ‘r.GraphicsAdapter=-2’ or ‘r.GraphicsAdapter=1’, to no avail. To make things worse: if I disable the Intel HD530, trying to force the Unreal Editor to choose my GTX 980 Ti, it reverts to the ‘Microsoft Basic Render Driver’! This CPU-based driver fails to deliver any performance whatsoever …
[2017.03.02-12.41.03:180] 0]LogD3D11RHI: D3D11 adapters:
[2017.03.02-12.41.03:208] 0]LogD3D11RHI: 0. ‘NVIDIA GeForce GTX 980 Ti’ (Feature Level 11_0)
[2017.03.02-12.41.03:208] 0]LogD3D11RHI: 6102/0/8118 MB DedicatedVideo/DedicatedSystem/SharedSystem, Outputs:1, VendorId:0x10de
[2017.03.02-12.41.03:306] 0]LogD3D11RHI: 1. ‘Intel(R) HD Graphics 530’ (Feature Level 11_0)
[2017.03.02-12.41.03:306] 0]LogD3D11RHI: 128/0/8118 MB DedicatedVideo/DedicatedSystem/SharedSystem, Outputs:0, VendorId:0x8086
[2017.03.02-12.41.03:310] 0]LogD3D11RHI: 2. ‘Microsoft Basic Render Driver’ (Feature Level 11_0)
[2017.03.02-12.41.03:310] 0]LogD3D11RHI: 0/0/8118 MB DedicatedVideo/DedicatedSystem/SharedSystem, Outputs:0, VendorId:0x1414
[2017.03.02-12.41.03:310] 0]LogD3D11RHI: Chosen D3D11 Adapter: 1
[2017.03.02-12.41.03:314] 0]LogD3D11RHI: Creating new Direct3DDevice
[2017.03.02-12.41.03:314] 0]LogD3D11RHI: GPU DeviceId: 0x1912 (for the marketing name, search the web for “GPU Device Id”)
[2017.03.02-12.41.03:314] 0]LogWindows: EnumDisplayDevices:
[2017.03.02-12.41.03:314] 0]LogWindows: 0. ‘NVIDIA GeForce GTX 980 Ti’ (P:1 D:1)
[2017.03.02-12.41.03:315] 0]LogWindows: 1. ‘NVIDIA GeForce GTX 980 Ti’ (P:0 D:0)
[2017.03.02-12.41.03:315] 0]LogWindows: 2. ‘NVIDIA GeForce GTX 980 Ti’ (P:0 D:0)
[2017.03.02-12.41.03:315] 0]LogWindows: 3. ‘NVIDIA GeForce GTX 980 Ti’ (P:0 D:0)
[2017.03.02-12.41.03:315] 0]LogWindows: 4. ‘Intel(R) HD Graphics 530’ (P:0 D:0)
[2017.03.02-12.41.03:316] 0]LogWindows: 5. ‘Intel(R) HD Graphics 530’ (P:0 D:0)
[2017.03.02-12.41.03:316] 0]LogWindows: 6. ‘Intel(R) HD Graphics 530’ (P:0 D:0)
[2017.03.02-12.41.03:316] 0]LogWindows: DebugString: PrimaryIsNotTheChoosenAdapter PrimaryIsNotTheChoosenAdapter PrimaryIsNotTheChoosenAdapter PrimaryIsNotTheChoosenAdapter FoundDriverCount:3
[2017.03.02-12.41.03:316] 0]LogD3D11RHI: Adapter Name: Intel(R) HD Graphics 530
[2017.03.02-12.41.03:316] 0]LogD3D11RHI: Driver Version: 20.19.15.4463 (internal:20.19.15.4463, unified:20.19.15.4463)
[2017.03.02-12.41.03:316] 0]LogD3D11RHI: Driver Date: 5-25-2016

Is there anything I can do to fix my problem?
This seems to be an issue in the heuristic responsible for choosing the correct GPU. I have already tried a clean driver uninstall and forcing the application to run on the GTX 980 Ti through the NVIDIA Control Panel; nothing works …

Any help is greatly appreciated!

For a desktop I would just go into the GeForce Control Panel and set it to use the GTX 980 Ti for everything rather than having the system try to figure out which one to use. I don’t think you’d need to use the Intel GPU for anything.

I absolutely agree, but disabling the Intel HD530 makes Unreal revert to the ‘Microsoft Basic Render Driver’ (as I stated before), making Unreal run completely on the CPU. Not exactly what we want … Disabling the Intel HD530 doesn’t get me any further.
In the GeForce Control Panel I have no option to choose which GPU will run a specific application.

As an illustration: when I monitor which applications are running on the GPU (using the NVIDIA GPU Activity screen, which can be enabled from the GeForce Control Panel), it is very clear that the Unreal Editor is not running on the GPU.

I can’t check right now since the systems I’m using don’t have an Intel GPU, but there’s definitely an option in there: you can set a global setting to use the NVIDIA GPU, manually set each program to use a specific GPU, or have it automatically try to figure it out. I have a Razer Blade laptop and had to force the system to use the NVIDIA GPU to do VR; otherwise it wasn’t running well even when I had manually set the games to use the NVIDIA GPU. I can’t remember exactly where in the GeForce Control Panel the option is, though.

Following up on this with a related question. We have rendering machines here with 6 GPUs per box, and we want to run UE4 instances on each GPU so we can distribute the rendering across all 6 GPUs… I believe it is possible to go into the NVIDIA panel, manually select the GPU, run one build, go back, select a different GPU, run another instance, and so on, but this is tedious… We would much rather be able to edit an .ini, run 6 instances with configured inis, and be done with it.

Adding r.GraphicsAdapter = 4 to ConsoleVariables.ini unfortunately does nothing; the build always comes up using the same GPU regardless… Any ideas?

Anyone? It’s kind of boring to have extra GPUs just chilling around. It would be great (and I guess not so hard) to offer not just an *.ini setting but also an environment variable or command-line option. Epic, this would seriously make a difference for a lot of people (we’re not talking about an integrated GPU versus an NVIDIA GPU, but about multiple NVIDIA GPUs). If you position the engine as a serious production tool, this is a must-have feature.

By the way, the ini “solution” from the original post doesn’t work.

I can confirm what those above are saying: on Ubuntu 16.04 with UE 4.22, adding r.GraphicsAdapter=1 to ConsoleVariables.ini still results in it running on device 0.

Epic staff, can you give us an update on this? Being able to simply select which GPU to render on is a must-have option…

Still having this problem!

Entering r.GraphicsAdapter=1 in UE5 EA doesn’t do anything. Adding it to D:\Content\Unreal Projects\MyProject5\Config\DefaultEditorSettings.ini as r.GraphicsAdapter=1 also doesn’t do anything. I also cannot find any ConsoleVariables.ini file; I only see D:\Content\Unreal Projects\MyProject5\Saved\Config\ConsoleHistory.ini. I tried that one as well, but it doesn’t work. I created a file D:\Content\Unreal Projects\MyProject5\Config\ConsoleVariables.ini and added r.GraphicsAdapter=1; that doesn’t work either. Has anyone managed to do this who can tell me how exactly it works?

Found it: unreal 4 - ue4editor.exe GPU selection - Game Development Stack Exchange

Seems not to work any longer. I wonder how many users have this issue; there seems to be no solution.