Egh, well I think this is an important distinction to make. The game is successfully running on multiple cores, but it is not using multiple cores’ worth of total processing power. Your choice of terminology is misleading.
And this game using only ~20% of that person’s CPU is not necessarily a bad thing. What it likely points to is the game being GPU bound (as mentioned by Syed). Of course none of us here (as far as I know) are developers for Ark so we can’t do much in the way of profiling.
My claim is based on what I saw in our project. It is easy to replicate. Simply build a complex scene if you do not already have a project that is heavy: repeat scene elements with heavy lighting, set up the foliage tool with many instances, or throw many physics-enabled objects into your scene. Do it until the frame rate drops. Then look at your CPU usage. It will not be using more than 1…1 1/2 cores overall. I would actually be really happy to be proven wrong and shown how it can be achieved. So far we have had no success.
Again, I am not trying to badmouth UE4 here for users who like it. It is just that everyone with a larger game project will sooner or later run into what seems to be the very rudimentary multi-threading of UE4 and want to get their frame rate up again by using the CPU power that is left unused (assuming, of course, that the game is CPU-bottlenecked, not GPU-bottlenecked).
Do not mistake that for ‘fan-boy’ stuff or whatever. In the long run, you do not do yourself any favors by sugar-coating an issue like that for the sake of keeping up the ‘all-perfect UE4 world’ in your mind.
What was probably observed is a completely unoptimised pipeline and assets (basically you just throw things in, as long as it works). As a result, the engine is being abused. If that is indeed the case, then that is not what UE4 is designed for.
To be sure, design the level as if it were meant for production, and then reproduce the core usage. Then we are talking…
No, we spent a lot of time optimizing, especially on level design. At some point, though, you cannot strip down any further and have to rely on multi-threading. That is what any modern game does.
Try yourself with my test case suggestions from above.
If UE4 makes poor use of the remaining CPU power when the frame rate drops (due to a CPU bottleneck), then it is an engine problem. There is nothing you can do to fix poor multithreading with level design etc.
You will see.
To be honest, I have, because I know how tricky it can be to get proper multi-threading running.
I quickly checked.
He talks about multithreading in UE4. It is just a prime number test case, which can easily be multithreaded. Multithreading an entire game is more difficult.
But it shows that UE4 is not multithreading well out of the box. Otherwise he would not have to set it up by himself.
It is just a small example, but any other thing that you create in your game will have the same issue of not being properly multi-threaded.
If you throw stuff to render into your scene until the frame rate drops, you actually reduce the need for multithreading. When the time between two frames becomes longer, the CPU has much more time to catch up with the work it has to do in that frame’s ‘Tick’ functions. The result is probably a CPU that is idling most of the time.
What I am trying to say is that the method you are proposing to prove that multithreading in UE4 is inefficient is probably not appropriate.
I am not saying that multithreading in UE4 is good or bad. I simply have no data that points one way or the other.
If my proposed method of testing multi-threading/CPU usage in games is not right, then what would you need multi-threading for in games at all? In games, all you need multi-threading for is more objects in the scene, better lighting, more physics-enabled objects etc., so more going on on-screen.
The foliage tool test case is a good example. You want grass as far away from the player as possible and not culled too early, to avoid naked polygons being visible. It is the classic example of a performance/fps-critical element in a level, and you do not want the CPU idling when the fps drops because of it. There are many more examples.
Essentially, what I want to bring across is that when you ship a game done in UE4 and you specify the maximum requirements, then you can write:
maximum cpu: dual core
This actually makes a lot of sense. Open the console and type “stat unit”. Have a look at the values and you’d be surprised. Having a scene with a lot of objects means the CPU doesn’t have anything to do while the GPU is rendering all the stuff. Most of the time the CPU just waits until the GPU is done rendering.
Great, the UE4 audience manages to label CPU multi-threading in games as irrelevant, because every element in the game only uses the GPU. I wasted the last few years working on engines.
Here is another thing to try:
Try any other game. For example, I just checked AC Unity. Run it in windowed mode with the task manager open next to it.
You will see what I mean. It uses almost 100% of the CPU power. You can try this with any other big in-house engine game.
UE4 is really lacking multi-threading performance. It is just the way it is.
But then, it is free, and you can keep sugar-coating it for yourself.
Hardly any indie/hobby/non-professional user using it will ever make a game that will hit a fps wall anyway.
But remember, do not buy a quad-core or higher if you want to use it for UE4. A dual core is enough; instead, use the money you saved to buy a book about multi-threading.
I think most games actually don’t need more than one or two cores. In many cases one would probably be enough but since the OS requires some resources as well a second one might be a good idea.
Most of what games give you is eye candy on top of more or less simple game mechanics, and making eye candy is in almost all cases the job of the GPU. The actual game logic is often very primitive. For example, the game logic of StarCraft 2 is not much different from StarCraft 1. So if StarCraft 1 ran on machines from the 90s (is that right? I am not sure), then a modern CPU can do that stuff easily.
The only cases I can think of where the CPU might be critical are very simulation heavy games with large amounts of physics or maybe some sort of complex economic simulation. Maybe a very complex AI system can also make good use of some additional CPU power.
AC Unity probably does some fancy stuff with the crowd AI or whatever you want to call it. If I remember correctly, there are also some very nice clothing simulations going on. If done correctly, I am sure you can set these things up in a way that all the idle time of the CPU is used for them. But that might not be that useful for the majority of games, so a general-purpose engine can live without it.
Just to put your theory to the test, I ran The Witcher 3 and looked at the CPU usage: 33%. A big in-house engine, and it’s only utilising one, or maybe one and a half, of my cores. It’s pretty baseless to look at the CPU usage of a game. The fact is UE4 does utilise multi-core machines by running threads on different cores, just like any other engine. Praseodyl pointed out already that AC probably uses more CPU because of the simulation stuff in their engine that runs on the CPU; the same can be accomplished in UE4.
Out of the box? That’s exactly what I want to point out.
It would not use more CPU out of the box, even if it had to and even if it would help. It can be accomplished, for sure, but only if you go into the source code and make changes there.
Again, I am only talking about CPU bottlenecks here. I am aware of GPU-CPU interplay.
Look, it’s pointless arguing here. I have shown that other big engines also don’t utilise 100% of the CPU at all times, just like UE4. The functionality to thread your own simulations is part of the UE4 SDK. Let’s leave it at that: you’re not going to accept that UE4 is sufficiently multi-threaded, and I’m not going to accept the opposite.
BTT: I can imagine the reason why many of the published games using UE4 have performance issues is their early release. It’s probably a valid marketing strategy to publish one of the first games made with a new engine. Since optimizing a game takes time and resources, the optimization is often skipped in order to release the game before the competition does. This means that using UE4 and releasing badly performing games might be correlated, but this does not mean that the games are performing badly because of UE4.
…but it might as well mean exactly that!
Believe me, I know a hardware eater when I see one! (not a valid point to make, I know)
Ok, I accept! Stalemate!
Fine with me. We will go into the source sooner or later anyway. It would have been nice to do without that for multithreading and get better out-of-the-box performance from something that claims to be a game engine.
Future indie/hobby/amateur developers who are not able to modify the source code will have the CPU parts of their game capped to 1…2 cores.
At least they will not need quad-core CPUs then, and their hardware budget will suffice.
And the max-requirements line of the released product can also be copy/pasted: ‘a fast dual core’.
Would be interesting to get a statement from Epic about this.