Very bad performance even with an empty scene on Intel HD Graphics


I was working on a little hobby 2D project in Unity and decided to buy UE4 and give it a try, since I find myself very comfortable with C++.

However, I noticed that with any project, even an empty one with the skybox hidden etc., I get VERY bad framerates (10-13 fps, editor window maximized on my 1920x1200 monitor, default window layout). I even tried hardcoding the runtime to disable AA, and the performance is still very poor.
Changing the viewport setting from “Lit” to “Unlit” helps a bit (~20 fps now), but it’s still very low. Even the Tappy Chicken project runs slowly.
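In case anyone wants to reproduce this without touching the source: I believe AA and resolution scaling can also be toggled through console variables rather than hardcoded, assuming these cvars are exposed in your engine version (names may differ between builds):

```
; ConsoleVariables.ini, or typed into the in-editor console.
; Assumes these cvars exist in your engine version.
r.PostProcessAAQuality 0   ; 0 disables anti-aliasing entirely
r.ScreenPercentage 70      ; render at 70% resolution and upscale
```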

I’m running this on an i5-3570 CPU+GPU (HD4000), and sure enough it’s not a high-end GPU, but I’ve been impressed several times in the past by how well it handles some loads. Hell, Dota 2 on Linux running at 1080p on this GPU feels faster than an empty scene in UE4Editor.

I’m wondering if this is expected with UE4 on this GPU and it just isn’t suited for light/small games on relatively low-end hardware, or if it’s just a problem I’m stumbling on…

More details:

  • I’ve compiled the 4.1 editor and the latest preview build (“Development Editor” configuration), both with the same result.
  • The CPU usage is ~8% while idling in the editor.
  • I’ve got a clean Windows 8.1 installation.
  • I’ve got the most recent drivers installed.

Take a look at this thread: [Epic Official] Read this if you are experiencing FPS drops after the 4.1 patch - UE4 AnswerHub. Some of the suggestions you can find in it will probably help you :slight_smile:

Thanks for the link, but neither solution worked.

The graphics adapter choice wasn’t the problem, since the latest preview build already improves the heuristics used to select the graphics adapter. I checked the log just to be sure, and it is selecting the Intel adapter.
The QFE patch also didn’t work. Not sure what it changes either…

Sure, I know my hardware has limited capabilities, but that doesn’t really explain why changing the viewport’s setting from “Lit” to “Unlit” on an empty scene (only the grid is rendered) changes the performance so much.
I still find it hard to believe the GPU can’t handle the editor showing an empty scene and suspect some bug or bad behavior is the cause.

Unfortunately I can’t properly profile the code using Visual Studio Express to see where the poor performance originates…

There is a massive difference between the requirements for Dota 2 and UE4, for both the minimum and the recommended specs, so I’m not sure why you mentioned it.

Looks like you need to save up and buy a new graphics card…

Unfortunately for you, the Lit vs. Unlit difference does actually explain the performance. Unlit is akin to the very simple fixed-pipeline, per-vertex lighting of 10 years ago. Lit mode turns on materials and shaders (all of the programmable pipeline), which adds a huge amount of complexity, as the graphics pipeline is now performing customised light calculations per pixel (the current state of the art).

The HD 4000 is programmable, of course, but it just doesn’t have as many execution units or as much speed as a dedicated modern GPU. It therefore struggles with the complexity of current state-of-the-art shaders, which is why you see a huge performance increase when you turn off that part of the rendering pipeline.

This isn’t an issue because Epic haven’t optimised enough; it’s an issue because the current state of the art demands more parallel processing than this GPU can handle.

Whilst not a perfect comparison, the HD 4000 has 16 execution units, whereas an NVIDIA GTX 670 has over a thousand CUDA cores.
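To put rough numbers on the Lit vs. Unlit gap, here’s a purely illustrative back-of-the-envelope. Every figure below is an assumption chosen for the sake of the ratio, not a measurement of any real GPU or engine:

```python
# Rough, purely illustrative: why per-pixel ("Lit") shading costs so much
# more than simple per-vertex ("Unlit") shading. All numbers are assumptions.

pixels = 1920 * 1200        # viewport roughly fullscreen
ops_per_pixel = 200         # assumed ALU ops for a per-pixel lighting pass
verts = 50_000              # assumed vertex count of a simple scene
ops_per_vert = 30           # assumed ops for fixed-function-style lighting

per_pixel_work = pixels * ops_per_pixel      # work scales with resolution
per_vertex_work = verts * ops_per_vert       # work scales with geometry

print(per_pixel_work / per_vertex_work)      # 307.2 with these assumptions
```

The exact numbers don’t matter; the point is that per-pixel shading scales with resolution, so even an “empty” 1920x1200 viewport is hundreds of times more work than per-vertex lighting on a simple scene.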

Without me running some tests, does anyone know approximately what percentage of overhead running/simulating a level/game in the editor adds (vs. a final cooked/built .exe game)? Any rough guesses, or has the info been posted (say, a 10% hit in performance when running in the editor vs. the final executable)? Thanks.

Edit: running on a GTX 780 here, which I’d consider the future ‘minimum’ for GOOD-looking UE4 games :wink: and even that will soon be too slow (especially for VR).

My opinion: Intel HD is not for games, it’s only for office GUI applications.

I understand all that’s been said, and it sure is possible that the Intel GPU just doesn’t cut it for UE4. Nothing wrong with that; I was just trying to make sure it’s not a bug.

That said, if it’s true that I’m not going to be able to get more performance out of this GPU, this thread may be useful to others trying to build small 2D games like I was. UE4 won’t handle that optimally for now.
I’ll just wait until there’s a simple forward renderer, and/or try to do it myself if I find the time. After all, that should be important in this age of indie gaming and mobile devices.

Thank you all for the answers :slight_smile:

I decided to take a deeper look at the UE4 source, and found that code for forward rendering is already there, and that the D3D11 RHI actually supports simulating ES2-only support, which ends up creating a forward renderer instead of the default deferred renderer.
Forcing this code path ended up crashing the editor at the line “check(!AllowHighQualityLightmaps());”, which I forced to return “false” to get past. This let me run the editor, although the fonts aren’t rendered except for the FPS counter.


As you can see, I now get 60 fps on this almost empty scene, with enough eye candy for most simple games.

I guess there isn’t that much work to do to support low end GPUs after all. I feel a little more confident now, and may research further. However I’d still like to hear from someone at Epic if there’s any intention to improve this kind of situation.

There won’t be any plans to support lower-end systems, because the engine is built for next-gen; running it on a machine that’s not even current-gen is naturally going to cause problems. You can’t have legacy support for everything under the same engine. It’s the same reason you can’t run Windows 7 on a 10-year-old laptop that shipped with Win 98: the specifications and hardware simply aren’t there.

The Forward Renderer code is not complete, and everything has been tailored for the deferred rendering system. It’s not really possible to just switch rendering paths, and you’re setting yourself up for a LOT of headaches by doing so. Note that most complex materials, and most components of materials, will not work either, nor will reflections or anything but static lighting. Slate is also made for the Deferred Renderer, so you won’t be able to make any UIs either (the Engine itself is made in Slate, hence why it looks buggy and glowy in your screenshot there).

Changing the AA type won’t work either; the Engine uses FXAA by default, which is (I believe) one of the quickest and least processor-intensive AA methods around right now. Your GPU is a major bottleneck, and unless you change it you won’t be able to develop properly in UE4.

Isn’t the HD 4000 not just low-end, but integrated graphics? I don’t know why you’re surprised that you can’t run a notoriously power-intensive visual program when you don’t even have a graphics card.

Sorry, but I’ll only believe that coming from an Epic engineer. If the Forward Renderer code is there, there must be some interest in it. It appears to be used for mobile.

Well, I’m sure no one expects the same functionality from a Forward Renderer as from a Deferred Renderer. That’s not my intention, and I don’t need such a complex renderer for a simple 2D game anyway… Slate isn’t really blurry; the screenshot is blurry because it’s scaled down, and you can see that the operating system windows and text are also blurry because of that. I don’t know why the font is not rendering yet, but I suspect it’s just an incompatible material/shader or something that should be easy to fix.

I can assure you that changing the AA type in the deferred renderer works. I managed to disable it rather easily.

It’s a new generation of integrated graphics; it’s not the old Intel integrated graphics we’re used to hearing about. It actually supports OpenGL 4 features and so on, which is actually impressive considering it’s “just” an integrated GPU. As you can see in my previous screenshot, the GPU doesn’t have problems rendering not-so-complex scenes, and it could handle a simple game without much effort. Plus, Paper2D is coming, and I don’t think it’s reasonable to require a GTX 780 to play Flappy Bird.

I’m interested because there doesn’t appear to be much reason the engine couldn’t handle a low-end GPU; plus, it’s open source, and it’s C++.

[Edit] Also, can we PLEASE be technical about the real reasons UE4 couldn’t handle low-end GPUs? That would make the thread a lot more interesting…

If UE4 wants to compete in the mobile space, it must run on that kind of graphics hardware. I think it will run once Epic has time to optimize, because an empty scene should use almost zero resources if the engine is optimized (and post-processing is turned off).

If this is to be the engine of the future, it must support both forward and deferred rendering pipelines. I want a flexible engine. I think some of the newest trends in graphics programming will make deferred obsolete again in the long run, so it goes from forward to deferred and back to forward again.

Just managed to properly render the text in the editor. It was just that the alpha channel is handled differently with the ES2 profile I was using (to make the renderer use forward rendering).


Link to original resolution image:

How does the game run if you launch it without the editor (in deferred)?

I get an error (unrelated to this work) when trying to publish the game, and I haven’t looked into that yet; but I don’t see why it would run with the deferred renderer when I hard-coded it to use the forward renderer.

In any case, I don’t think the difference in performance inside or outside of the editor would be very significant (if using the same render paths)…

Running within the Editor is going to be quite a bit slower than running the .exe directly. If you hook up the profiler in the editor, you’ll see Slate biting into your performance quite a bit. For 60 fps, you need to hit 16.6 ms per frame. I think Slate consumes about 2.41 ms by itself. When you run a game .exe, I bet you don’t have any widgets etc., so Slate shouldn’t be a factor.
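That budget arithmetic, spelled out (the 2.41 ms Slate figure is the rough estimate quoted above, not a measured constant):

```python
# Frame-time budget at 60 fps, and what's left after editor UI overhead.
# The Slate cost is the rough figure quoted above, not a measurement.

target_fps = 60
frame_budget_ms = 1000 / target_fps   # ~16.67 ms available per frame
slate_cost_ms = 2.41                  # approximate editor UI (Slate) cost

remaining_ms = frame_budget_ms - slate_cost_ms

print(round(frame_budget_ms, 2), round(remaining_ms, 2))  # 16.67 14.26
```

So with these figures the editor UI alone eats roughly 15% of a 60 fps frame budget before the game renders anything, which is why a standalone .exe should fare noticeably better.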