Unreal Engine is broken, so why do people use it and like it?

Pretty much. Everything is a tradeoff. Gotta use the right tool for the right job. And then after you discover the secret to immortality, and make your own semiconductor company, with your own standards, you realize your own inventions are imperfect in their own ways.

A group of people that DGAF about Quality Assurance, Performance, or Regressions before submitting a branch.

Fixed.

@Rawalanche
Steve Jobs: an origin story.
Nice, right down to the pickaxe.

Let's be serious here. The problem is this engine, not the other stuff.
The other stuff you have to comply with for people to be able to use your engine or play your game.
Coding a C++ game while writing your own rendering pipeline is totally possible. Perhaps even better.

The point of using an engine is just that someone else has already done those "tedious" parts for you.
You can jump-start a project and work on what actually matters instead.

The problem is that Unreal doesn't really offer those benefits at all. You mostly end up reinventing the wheel to work around the issues it creates for you.
Which also means you waste tons of time learning about those issues: smashing your head against them, understanding them, and then solving them.

You don't have to literally fight any other engine I have ever used the way you have to fight Unreal.

For me, this ^^^.

The one issue I DO regularly have is some kind of memory leak (or apparently aggressive caching / hoarding of RAM) where the RAM footprint of an editor session keeps increasing. ULTIMATELY it will run me out of RAM, but rarely have I had it crash on me otherwise.

As for gaming, yeah, I've had to learn how to really get the most out of it, but I'm lovin' it.

It seems to be more of a subjective experience than the truth. For me personally, Unreal Engine has done everything you are describing. They solved rendering for me, material creation and editing, particle systems, physics, an editor environment for advanced level assembly, a node-based scripting language, and much more.

I can't even imagine doing all that myself at anywhere close to such quality. Especially not in a single lifetime. And especially not if I also wanted to use it to make a game in the end.

The only thing that requires a little bit of bending is the game framework, but people keep forgetting it's a choice. You don't have to use it and can make it all yourself using your own base actors just placed in the level, as you'd do in other engines.

Insanely, I get to use all that completely for free, not having to pay a dime until I exceed an earnings bracket that I could never even dream of.

I've tried other engines, but none ever came close to UE in how many problems they solved for me.

If you loathe UE to the point you are willing to say that it "doesn't really offer those benefits at all" instead of something more realistic, such as "it doesn't offer all of those benefits at the level one would expect", then why are you still here? I am a bit confused…

What's confusing about this engine being a total pile of :poop: as of 4.22 and upwards?

Who said "I'm still here"? All my important projects jumped to CryEngine months ago.
And it makes Epic look like the QA-less chumps they are.

Unfortunately, knowledge acquired about this pile of :poop: is nearly unforgettable.
And there's always some hope that Epic finally decides to hire someone competent enough to finally add a proper QA pass to the releases…

Of course, that would literally mean that everything past 4.18 or so has to be scrapped and re-worked… so the eggheads above him would probably say "no".

UE5 was their chance to do this.
They failed miserably yet again.

Epic, as a company, has quite possibly the worst track record in existence on a LOT of things:
client satisfaction, community interaction, taking feedback, taking bug reports.
Etc.

You don't have to go far. They sue little kids for cheating at Fortnite. They steal other people's stuff and get sued for it too.
They pick (arguably rightful) fights with Apple over distribution rights, so the future of your project on iOS is in constant jeopardy. Shall I go on?

You saying that "for you it solved all the problems" means that you have never even attempted to release a game to date, or that all you built was some Candy Crush clone that can barely run on mobile.

TL;DR
If you are OK with mediocre, Unreal is a good engine.

If you are looking to make a real project (the revenue of which would exceed the free licensing threshold in a minute), then Unreal is not the right engine for you. Or anyone.

It's not by chance that game companies make their own engines…

And with regards to your comment

You do realize that this is exactly what most game companies do all the time, right?

Source, Source 2 (Half-Life 2 to Dota 2)
Naughty Dog's engine
Anvil
Foundation

I could go on. Basically almost every engine is made specifically for a game.
And/or had development directly coincide with a game.

Just because that's not something you are willing to do, it doesn't mean that you would need more than one lifetime.

If a metaphor is allowed:
Unity might be like the Ford Taurus of game engines. Well known (at least in the US), reasonable second-hand value, easy to find a mechanic when it breaks.
Unreal might be like the Lamborghini of game engines. Very fast, very sexy, but it breaks more often and is harder to fix when it does.

Now, is the argument that a Taurus is the right car for everyone? I think that would be a dumb argument. Different people prefer different cars. Different developers need different engines.

And, honestly, I feel Unreal Engine 4 is more like a BMW: maybe less reliable than a Toyota Camry, but not at the trouble level of the Italian supercars. For that, you need the pre-release UE5 download :slight_smile:

I feel similarly. Unreal was good when I joined for the first time, but after I release my first game, I'm jumping over to Godot. Am I making a mistake choosing Godot? I don't expect, need, or even want the best graphics; what I do want is an engine that won't randomly tank performance or randomly remove vital features.

Godot seems pretty solid to me.
Of course you end up having to do most things yourself, but for my projects that was the case with Unreal too, so it didn't really matter.

BTW, when you folks up there in the wee early posts say "photorealism"…
Do you even realize that currently, if you shadow 1000 cubes, you get 14 fps or so? (At ~2K on an overclocked 1080 Ti.)

Those cubes have 6 faces; each face has 2 tris, so the whole field is only about 12,000 triangles.
Do the damn math!

Assume that a half-decent (SpeedTree) tree likely has 1500 to 3000 tris.

How many trees can UE4 support before you can right-click the project folder and just delete it???
(Second-grade algebra right there: 12,000 tris is the equivalent of roughly four to eight such trees.)

Now go off and do the 1000 lit cube test across the other engines.

Based on that, you know which one has a better baseline rendering thread.
(Which could mean nothing at all if your settings for the initial tests are off too. The scene / post-process setup and engine setup have to match. Don't go testing ray tracing on UE4 and comparing it to the regular renderer in Unity, for instance.)

There must be some other assumption in that statement, not stated.

Assuming you use shadow maps, the number of objects almost doesn't matter at all. Maybe if you simulate them all, and they all are close to each other, and you run in debug mode, physics will suck up a lot of performance? Maybe if you create a separate material for each of them, rather than using a material instance, the state changes end up costing a lot? I bet if you tessellate each cube three times, making 128 triangles per side instead of 2, you will see almost no change in fps; it's very unlikely the triangle count is the actual problem.
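
To make the material point concrete, here's a rough C++ sketch (the helper name and parameters are mine, not from any project in this thread): every cube shares one dynamic material instance, so the renderer sees one render state instead of a thousand.

```cpp
// Sketch: give every cube the same material instance so the renderer sees
// one render state, instead of creating a unique material per cube.
#include "Materials/MaterialInstanceDynamic.h"
#include "Components/StaticMeshComponent.h"

void ApplySharedMaterial(UObject* Owner,
                         UMaterialInterface* BaseMaterial,
                         const TArray<UStaticMeshComponent*>& Cubes)
{
    // One dynamic instance, created once and shared by all cubes.
    UMaterialInstanceDynamic* Shared = UMaterialInstanceDynamic::Create(BaseMaterial, Owner);

    for (UStaticMeshComponent* Cube : Cubes)
    {
        Cube->SetMaterial(0, Shared);
    }
    // The expensive variant would create a separate material (or dynamic
    // instance) inside the loop: same pixels, far more state changes.
}
```

Same pixels either way; the difference is purely in how many render-state changes the scene generates.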

But, as I said: for that number to mean anything, a lot more variables need to be controlled for. (You mention that this needs to be done when benchmarking against other engines, but I'd like to see what causes 14 fps for you.)

I put together a simple scene: a ground box, 1024 boxes, physically simulated, with cascaded shadows on them. (I'm using Force No Precomputed Lighting, in 4.26.2.)
This scene runs at 2K in the editor at 90 fps while the objects simulate, and 120 fps when the objects come to rest.
I didn't even package it up; clearly, something else is going on in the "14 fps" number.

(Yeah, when the editor is switched to the background to take a screen clipping, the FPS is ■■■■, as can be seen; you'll have to trust me it runs at 120 fps :smiley: )

I packaged it up and ran it on a 1440p GTX 1080 Ti. It ran at a smooth 60 FPS; the display there doesn't go faster, and I didn't bother to turn off vsync. GPU load was ~70% according to the NVIDIA tools.

Many times the problem is that you have not chosen the correct engine for your game, and of course the fault is not yours; it is the tool's, an inert and lifeless thing.

So this type of post always falls into some of the following categories:

  1. The game you are trying to make is beyond your capabilities.
  2. You want to make a Pong clone and you don't need the framework. (Even if your game has a game mode, players, a camera, and a controller.)
  3. Unreal does not use C#.
  4. Your computer is a coffee maker that can't even run Paint.
  5. You want to make a huge open world.

No. No shadow maps. Games, so fully dynamic.

Actually no.
Neither the Chaos build nor PhysX from the launcher has physics issues simulating 1000 cubes. (Then again, I'm on an i9; I wouldn't really expect a performance issue on the CPU side.)
To test it, turn off shadow casting on a directional light.

No. But they aren't supposed to be instanced. Of course, instancing would cost less.

No. Not with dynamic lights. The cost is per tri, per light. So more tris = grinding halt.

I don't think I had anything particular going on last time I gave this a whirl.

Standard 3rd person template. Pop in a cube. Set to simulate physics. Set Movable.

Alt-drag to make copies, up to 10 vertically.
Alt-drag to make 20/30/40 and so on up to 100. Place them a bit randomly if you will.
Select all, then alt-drag them all to make 200/300, etc., up to 1000.

Randomly placed each time, and with overlaps, so the physics gets, you know, actually tested. Given that initially Chaos was doing very poorly with high numbers of simulating bodies.

No need to force No Precomputed Lighting on it either; the cubes are movable.

I doubt I had other windows open; since it was so bad that I filmed it last time, closing all windows is the first thing I'd have done…

And yes it was obviously very bad and just due to the shadows.
Disabling shadows jumped it right back up to 120 fps.

Either way, no other engine reacts so poorly in dynamic lighting situations for simple cubes so far.
And 4.20-ish didn't bat an eye / ran at over 80 fps, no problem, on all those stress tests way back when.

But Unreal 3/4 has never been for those parties of lights and dynamic objects; it has always been for rooms (even with trees and rivers) and lightmaps. That's why CryEngine maintains its niche for open worlds.

There's a terminology issue here.

Light maps are static. Shadow maps are dynamic. They're the "SM" in "CSM," for example.

That's not what I'm seeing. Changing the cubes to spheres has no impact for me.

[quote]no other engine reacts so poorly in dynamic lighting situations
[/quote]

And neither does Unreal Engine. There's something different between your scene and my scene.
Run it yourself: https://watte.net/SimpleShadowTest1000Cubes.7z [132 MB]

There's no precomputation here; there's an actor that spawns 1024 cubes in On Begin Play, offset a bit so they actually do the bouncing-around thing.
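
For anyone curious, this is roughly what that spawner would look like in C++ (just a sketch; the actual project is all Blueprint, and the class name, grid spacing, and the engine cube asset path are assumptions on my part):

```cpp
// Rough C++ equivalent of the Blueprint spawner described above. The class
// name, layout, and the cube asset path are assumed, not taken from the project.
#include "Engine/World.h"
#include "Engine/StaticMesh.h"
#include "Engine/StaticMeshActor.h"
#include "Components/StaticMeshComponent.h"

void AShadowBenchSpawner::BeginPlay()   // hypothetical actor class
{
    Super::BeginPlay();

    // The engine's basic cube asset (assumed path).
    UStaticMesh* CubeMesh = LoadObject<UStaticMesh>(nullptr, TEXT("/Engine/BasicShapes/Cube.Cube"));

    for (int32 Index = 0; Index < 1024; ++Index)
    {
        // 32 x 32 grid, offset a bit in height so the cubes tumble on spawn.
        const FVector Location = GetActorLocation()
            + FVector((Index % 32) * 110.f, (Index / 32) * 110.f, 300.f + (Index % 7) * 5.f);

        AStaticMeshActor* Cube = GetWorld()->SpawnActor<AStaticMeshActor>(Location, FRotator::ZeroRotator);
        if (!Cube || !CubeMesh) { continue; }

        Cube->SetMobility(EComponentMobility::Movable);
        Cube->GetStaticMeshComponent()->SetStaticMesh(CubeMesh);
        Cube->GetStaticMeshComponent()->SetSimulatePhysics(true);
    }
}
```

The Blueprint version is the same idea: a loop on Begin Play that spawns the cube actors and enables Simulate Physics on each.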

Edit: Actually, I updated it all to be a little more packaged, for anyone who's interested and wants to benchmark it themselves. I updated the executable above, and also, here's the project (all Blueprint):
https://watte.net/SimpleBenchmark_UE4_26_20210911_Project.7z [63 MB]

Back 10 years ago we had issues with dynamic shadows not culling like they should. It showed them culling, but in reality they were not being culled and were still being drawn when they should have been culled out. That was causing terrible fps. That was with SpeedTree dynamic shadows NOT CULLING, so we quit using them, turned our trees into static meshes, and put them into the foliage editor to solve that issue. If you set it to cull at whatever distance, it will look like they are culling at that distance, but everything that is not showing is still being rendered; you just do not see it. Basically what we concluded was that it draws all shadows completely, all the way out; even if you see them culling at 500, they are not, and it is still drawing everything all the way out. We did some tests way back (10 years ago) to prove it.

Is there still a dynamic culling issue going on?

How we tested that: turn on all dynamic shadows all the way out with no culling and check the fps, then set it to cull at 500 and check your fps again. Is it the same, or did it get better with culling? Ours stayed the same; we had the same fps whether we had culling on or were drawing all the way out. That told me they were not being culled. Turn them off completely and watch the fps go through the roof.
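
If anyone wants to repeat that A/B test today, here's a small C++ sketch of the idea (the helper is hypothetical; in practice you can get the same effect by toggling Cast Shadow on the light or the meshes and watching `stat fps`):

```cpp
// Sketch: toggle shadow casting on every static mesh actor in the level,
// then compare fps (e.g. `stat fps` in the console) with it on vs. off.
// If fps also doesn't change when you shrink the cull distance, the
// shadows probably aren't actually being culled.
#include "EngineUtils.h"
#include "Engine/StaticMeshActor.h"
#include "Components/StaticMeshComponent.h"

static void SetSceneShadowCasting(UWorld* World, bool bCastShadows)
{
    for (TActorIterator<AStaticMeshActor> It(World); It; ++It)
    {
        if (UStaticMeshComponent* Mesh = It->GetStaticMeshComponent())
        {
            Mesh->SetCastShadow(bCastShadows);
        }
    }
}
```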

My scene has all dynamic shadows (shadow maps, cascaded shadow maps.)

I'll give it a whirl.
Just re-installed my work rig, so it'll take a bit.

Maybe it's because of how they are spawned under the hood that you get better performance?
Instead of copy/pasting instances in the editor?

And yeah, I read that as baked lightmaps rather than "shadow maps".
It doesn't really change much, in the sense that the items being set to movable prevents the light from baking onto them anyway, so I guess it's a moot point.

There's a starting cost for the initial render, and an additional cost for every tri. It's not a 1:1 ratio (though according to the Epic docs it seemed like a direct ratio.
I read this just last month and I can't find the page in question via Google, so it was likely edited? O_o)

Also, performance-wise, Directional Lights and Point Lights have some weird differences.
If you were to make your point light large enough to shadow all the items the same, you'd expect the same costs, but it's not so… because, you know, Unreal consistency? :stuck_out_tongue:

You also aren't really looking at all the spheres. There are maybe, what, 100 on screen, and most of the rest are culled out and/or have no need to render a shadow?
Freeze rendering and see what is actually being computed.

Perhaps the difference is that, the way I test it, all the cubes are on screen at the same time; there's no culling involved at all.

Either way, I'll take a look at the project you shared.

I'll give that a whirl too.

There wasn't one in 4.25.
Basically, setting the CSM distance to something stupidly small (1.5 m? I think?) was the only way to get a steady 60 fps on a heavy scene.
But the distance range was definitely affecting performance, so if it now isn't, that's a regression.

Getting very low fps from many shadowed objects can mean that you just configured your light incorrectly. Have you set your directional light to movable or stationary? If it's stationary, make sure the CSM distance is not zero and inset shadows for movable objects is disabled.
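
For completeness, those two settings in C++ form (a sketch only, assuming direct access to the UDirectionalLightComponent properties before play; most people will just change them in the light's Details panel):

```cpp
// Sketch: the two stationary-directional-light settings mentioned above,
// set from code instead of the Details panel.
#include "Components/DirectionalLightComponent.h"

void ConfigureStationarySun(UDirectionalLightComponent* Sun)
{
    // A CSM distance of zero means no whole-scene dynamic shadows at all.
    Sun->DynamicShadowDistanceStationaryLight = 10000.f; // ~100 m, pick to taste

    // Per-object inset shadows get expensive with many movable meshes;
    // letting the CSM handle them is usually the cheaper path.
    Sun->bUseInsetShadowsForMovableObjects = false;

    Sun->MarkRenderStateDirty();
}
```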

Have you seen how sexy and fast Unity HDRP with RTX really is? It makes me think of Unity as an 800 hp Subaru STI against your Lambo. It boils down to the skill of the driver. But a Subi costs way less, and maintenance is easier…

I was absolutely looking at all of them at the same time. They're slanted on spawn specifically so I can see all of them. Not to mention: the shadow map cost is based on culling from the point of view of the light, which, being a directional light, sees all the objects. Objects you can't see (because they are behind other things) can still cast shadows you can see, so the engine can't not draw those objects in the shadow map.

The very clear correlation was "number of physics objects simulating" versus "frame rate." With all objects simulating, it was at 80 fps or so; when all of them come to rest, it tops out at the refresh rate of the display.

Number of triangles is a very weak correlate of performance on any modern graphics hardware, in my experience: all the cost is in setting up and coordinating the drawing of each "batch," and graphics cards are just SO GOOD at drawing triangles within each batch. Especially if the triangles cover the same number of pixels in the end, most typical game scenes will not be triangle-count limited. (And then, with Nanite, that becomes doubly true…) (But then, "typical" varies for each creator.)
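
As a concrete illustration of the batch point (a sketch, not what the benchmark project actually does): pack all 1024 cubes into one instanced static mesh component and you pay for one batch instead of roughly 1024, while the triangle count stays exactly the same.

```cpp
// Sketch: one UInstancedStaticMeshComponent draws every cube in a single
// batch; the per-object setup cost is what disappears, not the triangles.
#include "GameFramework/Actor.h"
#include "Engine/StaticMesh.h"
#include "Components/InstancedStaticMeshComponent.h"

void BuildInstancedCubes(AActor* Owner, UStaticMesh* CubeMesh)
{
    UInstancedStaticMeshComponent* Cubes =
        NewObject<UInstancedStaticMeshComponent>(Owner, TEXT("CubeInstances"));
    Cubes->SetStaticMesh(CubeMesh);
    Cubes->RegisterComponent();

    for (int32 Index = 0; Index < 1024; ++Index)
    {
        // 32 x 32 grid, slightly above the ground plane.
        const FVector Location((Index % 32) * 110.f, (Index / 32) * 110.f, 110.f);
        Cubes->AddInstance(FTransform(Location));
    }
}
```

(Note that this trades away per-cube physics simulation, which the test above relies on, so it only illustrates where the draw cost lives.)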