Ark: Survival Evolved: Low FPS because of UE4 or due to lack of optimization?

AC Unity probably does some fancy stuff with the crowd AI, or whatever you want to call it. If I remember correctly there are also some very nice cloth simulations going on. If done correctly, I am sure you can set these things up so that all the idle time of the CPU is used for them. But that might not be that useful for the majority of games, so a general-purpose engine can live without it.

Just to put your theory to the test, I ran The Witcher 3 and looked at the CPU usage: 33%. A big in-house engine, and it's only utilising one or maybe one and a half of my cores. It's pretty baseless to look at the CPU usage of a game. The fact is, UE4 does utilise multi-core machines by running threads on different cores, just like any other engine. Praseodyl pointed out already that AC probably uses more CPU because of the simulation stuff in their engine that runs on the CPU; the same can be accomplished in UE4.
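As a side note on reading that 33% figure: the overall usage reported by the OS is averaged across all logical cores, so one fully busy thread on a six-core machine shows up as only ~17%. A minimal sketch of that arithmetic (the core counts are illustrative, not from the thread):

```cpp
#include <cmath>

// Total usage (%) when `busyThreads` threads each run flat out on a
// machine with `logicalCores` logical cores.
double totalUsagePercent(double busyThreads, int logicalCores) {
    return 100.0 * busyThreads / static_cast<double>(logicalCores);
}

// Inverse: how many cores' worth of work a reported usage corresponds to.
double coresWorthOfWork(double usagePercent, int logicalCores) {
    return usagePercent / 100.0 * static_cast<double>(logicalCores);
}
```

So 33% on a quad core is about 1.3 cores' worth of work, while the same 33% on a 12-thread 5820K would be closer to 4 hardware threads; the raw percentage alone tells you little without the core count.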

Out of the box? That’s exactly what I want to point out.
It would not use more CPU out of the box, even if it had to and even if it would help. It can be accomplished, for sure, but only if you go into the source code and modify it there.
Again, I am only talking about CPU bottlenecks here. I am aware of GPU-CPU interplay.

Look, it's pointless arguing here. I have shown that other big engines also don't utilise 100% of the CPU at all times, just like UE4. The functionality to thread your own simulations is part of the UE4 SDK. Let's leave it at that: you're not going to accept that UE4 is sufficiently multi-threaded, and I'm not going to accept the opposite.
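For readers wondering what "thread your own simulations" looks like in practice: UE4 exposes this through its task system (Async(), FRunnable, the task graph). An engine-agnostic sketch of the same pattern, using std::async as a stand-in for the engine's thread pool and a dummy sum as the "simulation":

```cpp
#include <future>
#include <numeric>
#include <vector>

// Placeholder for an expensive simulation step (purely illustrative).
double SimulateHeavyStep(const std::vector<double>& state) {
    return std::accumulate(state.begin(), state.end(), 0.0);
}

double RunFrameWithAsyncSim(const std::vector<double>& state) {
    // Kick the heavy work off the game thread onto a worker thread.
    std::future<double> sim =
        std::async(std::launch::async, SimulateHeavyStep, state);

    // ... the game thread stays free for input, gameplay, render setup ...

    // Block only at the point where the result is actually needed.
    return sim.get();
}
```

The engine can't know which of your systems deserves a worker thread; the point is that the hooks exist, and the pattern is the same whether you use std::async or UE4's own task graph.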

Interesting. I wasn’t even aware of that.

BTT: I can imagine the reason why many of the published games using UE4 have performance issues is their early release. It's probably a valid marketing strategy to publish one of the first games made with a new engine. Since optimizing a game takes time and resources, the optimization is often skipped in order to release the game before the competition does. This means that using UE4 and releasing badly performing games might be correlated, but it does not mean that the games are performing badly because of UE4.

…but it might as well mean exactly that!
Believe me, I know a hardware eater when I see one! (not a valid point to make, I know)

Ok, I accept! Stalemate!
Fine with me. We will go into the source sooner or later anyway. It would have been nice to do without it for multithreading and get better out-of-the-box performance from what claims to be a game engine.
Future indie/hobby/amateur developers who are not able to modify the source code will have the CPU parts of their game capped to one or two cores.
At least they do not need quad-core CPUs then, and their hardware budget will suffice.
And the max requirements line of the released product can also be copy/pasted: ‘a fast dual core’ :slight_smile:

Would be interesting to get a statement from Epic about this.

Using Afterburner, my GPU usage in this game hardly gets above 80% (on low or epic). With ‘stat unit’ the draw calls seem to be bottlenecking this game.

Config: GTX 980 @ 1500 MHz with a 5820K @ 4.5 GHz.

“Please support dynamic and static batching to minimize drawcalls for mobile and low-end PC” (Feedback for Unreal Engine team - Unreal Engine Forums)

And maybe for high-end PC too :slight_smile:
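For context on why batching matters: draw calls are issued per render-state change, so meshes that share a material can go out in one call instead of one call each. A toy sketch of that bookkeeping (the names are made up for illustration; a real batcher also merges vertex buffers and respects size limits):

```cpp
#include <map>
#include <string>
#include <vector>

struct MeshInstance {
    std::string material;  // the state we batch on
    int vertexCount;
};

// Without batching: one draw call per mesh.
int drawCallsNaive(const std::vector<MeshInstance>& meshes) {
    return static_cast<int>(meshes.size());
}

// With batching by material: one draw call per distinct material.
int drawCallsWithBatching(const std::vector<MeshInstance>& meshes) {
    std::map<std::string, int> batches;  // material -> total vertices
    for (const auto& m : meshes)
        batches[m.material] += m.vertexCount;
    return static_cast<int>(batches.size());
}
```

A scene with a thousand rocks sharing one material drops from a thousand draw calls to one, which is exactly the saving the linked feedback thread is asking for.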

5000 sprites render test

With all due respect, implying UE4 shouldn’t claim to be a game engine sounds a little arrogant.

UE4 is multithreaded in common/universal problem areas. For example, they recently implemented animation-related multi-threading, an area that benefits everyone.

I think we don’t need to worry about CPU usage, simply because DX12 and Metal will improve CPU usage within 6 months to such an extent that most games are likely going to be GPU bound very soon, and it might be preferable to allocate resources to GPU-related optimization such as what we’ll get in 4.8 (LOD groups, optimized post process, etc.).

Every game is different, and universal solutions for everything are not always possible. If you have a game with tons of X that eats all your CPU, it might be your job to multithread it for your needs.
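UE4 ships a ParallelFor utility for exactly this "tons of X" case: split the entities across worker threads in chunks. A minimal standard-C++ sketch of that pattern, with the per-entity work (doubling a value) as a placeholder:

```cpp
#include <algorithm>
#include <thread>
#include <vector>

// Update all entities, splitting the range into one chunk per thread.
void UpdateEntities(std::vector<double>& entities, unsigned numThreads) {
    if (numThreads == 0) numThreads = 1;
    const std::size_t n = entities.size();
    const std::size_t chunk = (n + numThreads - 1) / numThreads;

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < numThreads; ++t) {
        const std::size_t begin = t * chunk;
        const std::size_t end = std::min(n, begin + chunk);
        if (begin >= end) break;
        workers.emplace_back([&entities, begin, end] {
            for (std::size_t i = begin; i < end; ++i)
                entities[i] *= 2.0;  // placeholder per-entity update
        });
    }
    for (auto& w : workers) w.join();
}
```

This only works when the per-entity updates are independent of each other; the moment entities interact, you are back to the game-specific engineering the post is talking about.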

At the end of the day, you shouldn’t use the Windows CPU usage graph. Use ‘stat unit’; that’s all you need to look at to start. If you run into CPU usage issues, get the profiler up and start digging.
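For anyone new to ‘stat unit’: it shows Frame, Game (game thread), Draw (render thread) and GPU times in milliseconds, and whichever of the last three is largest is your bottleneck. A tiny helper encoding that first-order reading (real frames also have sync/present overhead, so treat this as a rule of thumb):

```cpp
#include <algorithm>
#include <string>

// Given the three 'stat unit' timings (ms), name the bottleneck.
std::string boundBy(double gameMs, double drawMs, double gpuMs) {
    const double worst = std::max({gameMs, drawMs, gpuMs});
    if (worst == gpuMs) return "GPU";
    return (worst == gameMs) ? "Game" : "Draw";
}
```

If Game time dominates, that is when CPU-side work (and the multithreading debate above) matters; if GPU dominates, no amount of engine threading will help.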

ARK’s issues are caused simply because they did not give a **** about performance yet, have a huge open world, and run lights fully in real time. Additionally, they customized the render engine with new, untested tech.
The game was made in only 6 months and will greatly benefit from incoming features in 4.8.
From what I’ve played, I think it’s pretty clear that the client is GPU bound, not CPU bound, so optimizing multi-threading is not going to solve anything here unless you have a really old CPU. It’s likely a draw call and triangle count issue plus the new tech stuff, each of which can be improved fairly easily.
That said, I’m pretty sure it will still require a good GPU to run properly, simply because they are pushing hard.

The good news is that extremely performance-hungry big-budget games such as ARK, or the more reasonable Fortnite, drive UE4 optimizations for all of us :slight_smile:

Multi-threading isn’t something you can magically fix on a global scale within the base engine; it’s up to your team to figure out how to best engineer your game and optimize where you need it. Even big publishers like Ubisoft/EA highly customize their engines to fit each game’s needs; their engines aren’t just off-the-shelf solutions that do everything.

Essential multithreading is! Take PhysX, for example: natively multithreaded, but capped in UE4. They are working on async scene checks (see the experimental option under the Physics tab in Project Settings), but this is really something that should have been there on release.
Also, I am sure all the kids here who started with UE4 because it is free and aimed at non-professionals are very happy to have learned that their game will never run smoothly using UE4 once it gets bigger, because if they want it to, they have to go into the source code, which most of the Blender-YouTube kids most likely cannot do. I do not understand Epic: if they decide to go for a broader public, then only with a ready-to-use product, not this low-level scripting GUI. It is not like they are only coding for their in-house engine anymore.
Phew, will I be happy to be back at work on a proper engine again. The moment you want to do something bigger-scale, the thing goes downhill. Good luck, Epic!

(Just another crash right now, btw)

I would suggest that anyone who is reading this thread and deciding whether or not to use UE4 due to performance please disregard PolyPlant’s statements. It is very achievable to create a big game in UE4 and keep it running smoothly and looking good; I know this because I am currently at that stage. Of course, it is also very easy to create a low-performance game if you are not careful, so make sure that you learn about optimization.

I think it’s worth taking into account that the developers of ARK implemented their own way of handling lighting in the game world.

If you check out the UE Twitch stream with them, he gives a brief overview of how they did GI. Needless to say, comparing ARK to anything that UE4 supports out of the box is not really an option, since they are running their own implementation under the hood.

For what purpose are you using complicated physics simulations in a game that would need that many cores? If you are creating a game, it is probably a better idea to fake most of the effect anyway. You have to keep your target audience in mind: who actually has that many cores available, and who of those people would be interested in the game you are making? Make sure that in the end your target group consists of more than a handful of people. If you want a very simulation-heavy game, the GPU might be the better choice for that anyway. Too bad the GPU is usually busy rendering stuff.

I am not saying that there is no way any game would ever need that, but it would be a very special case. You cannot expect a general-purpose engine to be suited for each and every corner-case game concept without customization. I don’t know all the game engines out there, but is there any engine that miraculously scales every subsystem to exactly fit your needs? If you want something unconventional, you probably have to put in some work yourself. And as mentioned before: UE4 gives you the interface to implement these things the way you like.

Just out of curiosity: What engine are you working with professionally?

Hang on… if you can enter ‘stat unit’ in a shipped game, then they haven’t even built a Shipping version; you shouldn’t be able to use any command-line stuff at all. Sounds to me like they didn’t bother.

ARK is Early Access so you could argue they haven’t shipped anything yet.

I can only assume that, by PolyPlant’s logic, Notepad, which is not multi-threaded, should take up 100% of one core when you open it. Any number lower than that and Notepad must not be utilizing the CPU very well?

That would actually make sense, assuming Notepad ran at 3 fps and every input took 2 seconds to process.

Go on, buddy, drive fast, hit the wall…hit the fps wall… be happy and all fine and dandy until then, push the pedal…yippee!, everything is pink flowers and candy :slight_smile:

The only reason there are no statements from the UE4 devs yet is because they know they are behind with essential multithreading. I mean, look how they try to incorporate async in the Physics tab in the Project Settings… experimental… I mean, you tick it and the Editor crashes. They know they are behind… they know… believe me.

If you watch the Unreal Engine stream about the game, he talks about their method of rendering; that should give you a good idea of why there might be an fps drop at this stage of the game’s development.

Let’s hope it will not take too long before we see new optimisations.

Non-professionals will never get their game to the point where ‘poor’ (your words) multi-threading will hinder them. ‘Blender-Youtube’ kids will grab prefab and un-optimized assets, throw a bunch of junk in the editor, wonder why nothing works, blame the engine, then jump to the next engine they think will magically work. If their project runs poorly it’s not due to multi-threading.

Now we see the real purpose of your posts - just to take shots at the engine. Goodbye.