Ark: Survival Evolved: Low fps because of UE4, or due to lack of optimization?

Ark is the first large-scale multiplayer game I’ve seen built on UE4 so far, and to be honest it does not make UE4’s performance look good. I know a lot of people will say it’s just not optimized properly, and even the developer said so on the stream, but is this really true? Or is our hardware just not ready to support games like this yet?

I have talked to different UE4 users, and some have said that a game world of this size, with this type of lighting, materials, foliage, etc., is ahead of our current hardware.

This seems more plausible to me the more I use UE4 and do my own optimizing on large environments. I didn’t create this thread to bash the game or UE4, but I’m working on a game world of similar size and quality, and seeing it all come together in a playable state with such low performance on such high-end hardware (my 980 gets 30 fps on low settings, and people with Titans only get 30–40 fps!) makes me concerned for my own project.

However, even though there do seem to be LODs and some optimization in Ark, when I look at the floor or the sky my fps still remains low. Is that a big reason why it doesn’t run well? Missing frustum and occlusion culling?

Or is this the reason (and what can we do about it?):

Unreal 4 is a work in progress, so until it stabilizes you should consider your project an early-adopter bet tied to the current release and Epic’s roadmap. I can at least verify that the overall lighting and materials have been improving with each release, more so in the 4.8 preview, and the rumor is that the large-world Kite demo map will ship with the gold version of 4.8. As a snapshot of the current state of the art, that should make it easier for those with large-scale projects in mind to make an informed decision.

Opinion-wise, if a project starting now has a 3–4 year development cycle, I see no issue with scaling the project up as features become available, and I am rather pleased with the foliage improvements from 4.7 to 4.8. Granted, UE4 is not what I would consider a fit-and-finish engine, but if you build future-proofing into your design objectives it can grow as the engine’s features grow, and considering the upfront costs that is a more than fair trade.

It seems to me that with Ark they threw everything into the game without any sort of performance concern. A fully dynamic game is going to be hard to pull off in any engine.

If you run the game in windowed mode, try keeping the Task Manager next to it, or at least the CPU usage graph, to see what percentage is used. It is not a perfect method of measuring CPU usage, but it definitely shows you what ballpark it is playing in.
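For a rough cross-check of the Task Manager numbers, you can compare a process’s CPU time against wall-clock time from inside a small standalone program. A minimal sketch in plain C++ (not UE4 code; the `spin` workload and both function names are invented for illustration): a ratio near 1.0 means about one core’s worth of work, near 2.0 two cores’ worth, and so on.

```cpp
#include <chrono>
#include <ctime>

// Run `work` and estimate CPU utilisation during it. std::clock() counts
// CPU time used by the whole process (all threads) on POSIX systems; on
// Windows it may track wall time instead, so treat this as a rough gauge.
// cpu_seconds / wall_seconds ~= "cores' worth" of CPU being used.
double measure_cpu_utilisation_during(void (*work)()) {
    std::clock_t c0 = std::clock();
    auto w0 = std::chrono::steady_clock::now();
    work();
    std::clock_t c1 = std::clock();
    auto w1 = std::chrono::steady_clock::now();
    double cpu_s  = double(c1 - c0) / CLOCKS_PER_SEC;
    double wall_s = std::chrono::duration<double>(w1 - w0).count();
    return wall_s > 0.0 ? cpu_s / wall_s : 0.0;
}

// A deliberately CPU-bound stand-in for "a game's frame work": spin for
// roughly 100 ms so the measurement has something to see.
void spin() {
    volatile double x = 0.0;
    auto end = std::chrono::steady_clock::now() + std::chrono::milliseconds(100);
    while (std::chrono::steady_clock::now() < end)
        x = x + 1.0;  // busy work the optimizer cannot remove
}
```

A single-threaded busy loop like `spin` should measure close to 1.0; if a game under load never pushes a measurement like this past 1–2 on an 8-core machine, that is the “practically single core” complaint expressed as a number.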

We also run into performance issues in our own development because UE4 is practically not multithreaded, which is basically a must for any modern game.

Saying UE4 is still young is a bit invalid. They developed it for some ten years before release. Multithreading is a core part of an engine, and if it is not there at release, it is unlikely ever to come.
Promise: everyone whose project reaches an advanced stage, with a lot going on on screen and under the hood, will sooner or later post here or somewhere else about performance issues.

UE4 is practically single core.

People celebrate that it is free, but I would prefer the subscription model. In the end, hiring software engineers to tackle the multithreading takes money.

Not sure where you are getting your facts from, but this isn’t true. Here is a quick run I did in standalone mode, no editor open. UE4 is using at least 5 of my logical cores (clear rises when starting and running, clear falls once it stopped), and this is a bare-minimum map:


Now, this is by no means a professional test, just a quick run, view, and exit. But as far as I can tell, it’s definitely threading decently.

Not to mention that ARK is definitely GPU-bound. It seems like the textures and materials are unoptimized, mainly. If you do “stat unit” in the game, the CPU is taking barely any frame time. Most likely the reason the OP sees what he does is that the GPU is taking so long to draw that the CPU is idling.

I’m as curious as anyone to figure out why ARK’s performance profile is so strange, but I’m going to say it’s definitely not anything to do with UE4 not utilizing the CPU well.
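For anyone who has not used it, “stat unit” prints per-frame times for Frame, Game (game thread), Draw (render thread), and GPU. The reasoning above can be sketched as a toy classifier in plain C++; the struct, the function, and the 10% margin are all my own invention for illustration, not anything UE4 defines:

```cpp
#include <algorithm>
#include <string>

// Per-frame timings in milliseconds, as reported by UE4's "stat unit".
struct StatUnit {
    double frame_ms;  // total frame time
    double game_ms;   // game thread
    double draw_ms;   // render (draw) thread
    double gpu_ms;    // GPU
};

// Whichever of game/draw/GPU comes closest to filling the whole frame is
// what the frame is waiting on. The 10% margin is an arbitrary
// illustrative threshold, not an engine constant.
std::string bottleneck(const StatUnit& s) {
    double margin = 0.9 * s.frame_ms;
    double worst = std::max({s.game_ms, s.draw_ms, s.gpu_ms});
    if (worst < margin)     return "none (frame capped or idle)";
    if (worst == s.gpu_ms)  return "gpu";
    if (worst == s.game_ms) return "game thread";
    return "draw thread";
}
```

For the numbers people report from ARK (GPU time close to a 33 ms frame, game thread far below it), this classifies the frame as GPU-bound, which matches the “the CPU is idling” interpretation.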

Well, I know this is going to be a discussion without end, but if you look closely at your graph, you see that UE4 is not using more than about 1–1.5 cores overall. I marked in blue what you have to count; do not forget to subtract the usage from other processes on the same core (the offset). If you stack it up, you see that it amounts to only 1–1.5 cores. This is what I mean. If you do proper profiling, you see even more clearly that only 1–1.5 of 8 cores are used, or roughly 20 percent of your CPU power.
So far I have not seen UE4 use more, no matter how low the fps gets, but maybe I am wrong. I do not want to criticize UE4. It is just that more and more people will complain anyway as soon as their projects get bigger and they hit the fps wall.
In the end it is useless to discuss. I promise you, as soon as you go full-scale game, you will be posting what I post.
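The “stack it up” counting described above is easy to write down. A small sketch with invented numbers (this is my own helper, not any profiler API): sum each logical core’s utilization during play minus that core’s idle baseline, and divide by 100 to get the cores’ worth of CPU the game actually consumes.

```cpp
#include <cstddef>
#include <vector>

// Estimate how many cores' worth of CPU a game uses: for each logical
// core, take its utilization (%) while playing minus its idle baseline (%),
// clamp at zero, and sum. A total of 150.0 => ~1.5 cores' worth.
double effective_cores(const std::vector<double>& playing,
                       const std::vector<double>& baseline) {
    double total = 0.0;
    for (std::size_t i = 0; i < playing.size() && i < baseline.size(); ++i) {
        double delta = playing[i] - baseline[i];
        if (delta > 0.0) total += delta;
    }
    return total / 100.0;  // percent -> cores
}
```

With made-up readings of {40, 35, 30, 25, 10, 5, 3, 2}% during play against a {5, 5, 5, 5, 2, 2, 2, 2}% idle baseline, this comes to about 1.2 cores’ worth: many cores touched, but little more than one core of total work, which is exactly the pattern being described.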


I think multi-threading (the ability to use multiple cores effectively) is an obvious design decision for any game engine. The Task Manager observation may not deliver an accurate picture, as someone pointed out; it could be GPU-bound or something else specific to the game.

It would be nice, though, if the UE4 developers could weigh in and explain the multi-core usage in UE4.

Egh, well, I think this is an important distinction to make. The game is successfully running on multiple cores, but it is not using multiple cores’ worth of total processing power. Your choice of terminology is misleading.

And this game using only ~20% of that person’s CPU is not necessarily a bad thing. What it likely points to is the game being GPU-bound (as mentioned by Syed). Of course, none of us (as far as I know) are developers for Ark, so we can’t do much in the way of profiling.

My claim is based on what I saw in our project. It is easy to replicate. Simply build a complex scene if you do not already have a project that is heavy: repeat scene elements with heavy lighting, do a large foliage-tool setup with many instances, or throw many physics-enabled objects into your scene. Do it until the frame rate drops, then look at your CPU usage. It will not be using more than about 1–1.5 cores overall. I would actually be really happy to be proven wrong and shown how it can be achieved. So far we have had no success.

Again, I am not trying to paint UE4 black for the users who like it. It is just that everyone with a larger game project will sooner or later run into what seems to be very rudimentary multithreading in UE4 and will want to get their frame rate back up by using the CPU power that is left unused (assuming, of course, the game is CPU-bottlenecked rather than GPU-bottlenecked).
Do not mistake this for ‘fan-boy’ stuff or whatever. In the long run, you do not do yourself any favors by sugar-coating an issue like this for the sake of keeping up the ‘all-perfect UE4 world’ in your mind.

Probably what was observed is a completely unoptimized pipeline and assets (basically you just throw things in, as long as it works). As a result, the engine is abused. If that is indeed the case, then that is not what UE4 is designed for.

To be sure, design the level as if it were meant for production, and then measure the core usage again. Then we are talking… :slight_smile:

No, we spent a lot of time optimizing, especially on level design. At some point, though, you cannot strip anything more out and have to rely on multithreading. It is what any modern game does.
Try it yourself with my test-case suggestions from above.
If UE4 makes poor use of the remaining CPU power when the frame rate drops (due to a CPU bottleneck), then it is an engine problem. There is nothing you can do with level design to solve poor multithreading.
You will see.

TBH, I have little doubt about UE4 using multiple cores effectively. I am still new to UE4; I will check this out as we progress.

By the way, from the way you described the repro case, I definitely sense it is not an optimized level.

To be honest, I have, because I know how tricky it can be to get proper multi-threading running.

I quickly checked.
He talks about multithreading in UE4. It is just a prime-number test case, which can easily be multithreaded. Multithreading an entire game is more difficult.
But it shows that UE4 does not multithread well out of the box; otherwise he would not have to set it up himself.
It is just a small example, but anything else you create in your game will have the same issue of not being properly multithreaded.
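For reference, the prime-number case is about the easiest thing there is to parallelize, because every sub-range is independent. A minimal sketch in plain C++ (std::thread rather than UE4’s task system; all names here are mine) of how such a test is typically set up: split the range across workers and sum the partial counts.

```cpp
#include <thread>
#include <vector>

// Trial-division primality test: slow on purpose, so there is real work.
bool is_prime(long n) {
    if (n < 2) return false;
    for (long d = 2; d * d <= n; ++d)
        if (n % d == 0) return false;
    return true;
}

// Count primes in [lo, hi) across `workers` threads. Each thread gets an
// independent sub-range and its own slot in `partial`, so no locking is
// needed until the final sum after join().
long count_primes_parallel(long lo, long hi, unsigned workers) {
    std::vector<long> partial(workers, 0);
    std::vector<std::thread> threads;
    long span = (hi - lo + workers - 1) / workers;  // ceiling division
    for (unsigned w = 0; w < workers; ++w) {
        long a = lo + w * span;
        long b = a + span < hi ? a + span : hi;
        threads.emplace_back([a, b, w, &partial] {
            for (long n = a; n < b; ++n)
                if (is_prime(n)) ++partial[w];
        });
    }
    for (auto& t : threads) t.join();
    long total = 0;
    for (long c : partial) total += c;
    return total;
}
```

Timing this with one worker versus four is the shape of the test in question. The catch the posters are circling is that a real game’s frame is nothing like this toy: its work items share state and depend on each other, which is exactly why “multithreading an entire game is more difficult.”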

If you throw stuff to render into your scene until the frame rate drops, you actually reduce the need for multithreading. When the time between two frames becomes longer, the CPU has much more time to catch up with the work it has to do in that frame’s ‘Tick’ functions. The result is probably a CPU that is idling most of the time.

What I am trying to say is that the method you are proposing to prove that multithreading in UE4 is inefficient is probably not appropriate.
I am not saying that multithreading in UE4 is good or bad. I simply have no data that points one way or the other.
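The headroom argument above is just arithmetic, and a one-liner makes it concrete. A toy model with invented numbers, assuming the frame length is set by the GPU while the CPU’s per-frame work stays fixed:

```cpp
// Fraction of a frame the CPU spends idle, in a toy model where the GPU
// sets the frame length and the CPU's tick work is a fixed cost per frame.
double cpu_idle_fraction(double frame_ms, double cpu_work_ms) {
    if (frame_ms <= 0.0 || cpu_work_ms >= frame_ms) return 0.0;
    return 1.0 - cpu_work_ms / frame_ms;
}
```

For example, with a GPU-stretched 33.3 ms frame and 8 ms of tick work, the CPU idles about 76% of each frame, while the same 8 ms inside an 8.3 ms (120 fps) frame would leave it almost fully busy. Low fps alone can therefore produce low CPU usage without proving anything about the engine’s threading.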

If my proposed method to test multithreading/CPU usage in games is not right, then what would you need multithreading for in games at all? In games, all you need multithreading for is more objects in the scene, better lighting, more physics-enabled objects, and so on: more going on on screen.
The foliage-tool test case is a good example. You want grass as far away from the player as possible, and you do not want to cull it too early, to avoid bare polygons becoming visible. It is the classic example of a performance/fps-critical element in a level, and you do not want the CPU idling while the fps drops because of it. There are many more examples.

Essentially, what I want to bring across is that when you ship a game done in UE4 and you specify the maximum requirements, you can write:
maximum cpu: dual core

Any money spent on more cores is wasted.

This actually makes a lot of sense. Open the console, type “stat unit”, have a look at the values, and you might be surprised :wink: Having a scene with a lot of objects means the CPU doesn’t have anything to do while the GPU is rendering all that stuff. Most of the time the CPU just waits until the GPU is done rendering.

Great, so the UE4 audience manages to label CPU multithreading in games as irrelevant, because every element in the game only uses the GPU. I wasted the last few years working on engines :frowning:

Here is another thing to try:

Try any other game. For example, I just checked AC Unity. Run it in windowed mode with the Task Manager open next to it.
You will see what I mean: it uses almost 100% of the CPU. You can try this with any other big in-house-engine game.

UE4 is really lacking in multithreading performance. It is just the way it is.
But then, it is free, and you can keep sugar-coating it for yourself.
Hardly any indie/hobby/non-professional user will ever make a game that hits the fps wall anyway.

But remember, do not buy a quad-core or higher if you want to use it for UE4. A dual core is enough; instead, use the money you saved to buy a book about multithreading :wink:

Though in what way would multithreading increase performance? Which workloads would be able to make good use of it?

I think most games actually don’t need more than one or two cores. In many cases one would probably be enough, but since the OS requires some resources as well, a second one might be a good idea.
Most of what games give you is eye candy on top of more or less simple game mechanics, and producing eye candy is in almost all cases the job of the GPU. The actual game logic is often very primitive. For example, the game logic of StarCraft 2 is not much different from StarCraft 1, and if StarCraft 1 ran on machines from the 90s (is that right? I am not sure), then a modern CPU can handle that stuff easily.

The only cases I can think of where the CPU might be critical are very simulation-heavy games with large amounts of physics, or maybe some sort of complex economic simulation. Maybe a very complex AI system could also make good use of some additional CPU power.