Unreal Engine is broken, why do people use it and like it?

It’s brilliant, but it’s oh so broken in oh so many ways.

Some things are just insane, like autokey not working. Fundamental things. Nanite is cool and all, but if your basic UI is broken, if basic things like duplicating a map and actually duplicating its dependencies aren’t there, then what’s the point?

It would be great if the next release were IDENTICAL to the last release in features: just a complete consolidation of the basics. You only need to do things once to realise they’re broken. The man-hours lost to everyone who uses the software going through the same nonsense over and over again are mind-boggling.

I can’t write code, and I started learning Unreal less than six months ago.

I have a game project that I tried to make in the other U engine, but it wasn’t possible to complete there due to performance issues, both in builds and in the editor.

Unreal and its robust Blueprint system have allowed me to remake the game myself, whereas before I had to hire a programmer. I’ve built it better and bigger, and it performs well.

Also, it’s a big open world with dynamic lighting and tons of foliage.

I like Unreal because it has empowered me to make games almost entirely on my own. It has mitigated the technical issues and the many thoughtful tools included in the engine make work fast and efficient.

I think people who have problems are probably having the same problems you’d have with any engine - trying to force it to do something it isn’t designed to do. Same problem I had in Unity - I tried to make it run a large open-world 3D game, and it just can’t handle that sort of thing. But Unreal has been keeping me very productive and happy. It empowers me to make the games I want to make with very little compromise.

To people who think the engine is broken, I’d say you probably aren’t as clever as you might think. I’m a moron with computers, but I am able to create non-trivial games without too much issue. Like, if I can do this - and I can’t even understand much of what is written in this thread - and you can’t, I’d question what sort of game you are trying to make, and also whether you are being disciplined about following common best practices.


You shouldn’t flaunt your ignorance while calling everyone else a moron.
It’s a bit like a monkey tossing :poop: is it not?

Glad Unreal works for you.
Currently, it works for literally no other working professional across several industries, from architecture to zoology (which goes back to the monkey).

You should also question who is doing something wrong:
You, who literally said you can’t understand much of what is written in this thread, or a working professional who gets paid thousands to figure it out?

Also, for anyone else still following:
.27 had a slight improvement in the base rendering cost. It is still around 9ms slower than it ought to be.

You have to build from source as the launcher version is beyond broken.

People aren’t making money with Unreal?

I’m sure we are, to some extent.
But only with older versions - circa .18 / .22

Anything “new” is unreliable if not completely broken.

It has to do with rendering parameters: unless you plan to release on a system with a maximum of 30fps at 1080p, you are SOL and need to use older engine versions.

I still need to actually test, but engine performance may make the Switch the only platform currently able to have “new” engine development happen on it.
The PS4 can also be OK, with its max of 1080p.

Anything else, as in real development, including movies, has been seriously suffering over the past 2 years that Epic has been releasing trash.

I’ve only worked in 4.26. And I came from Unity where I had tried many different versions.

Here with 4.26 I have dramatically better performance both in build and in editor compared to what I was used to. So that’s my point of reference.

Compared to older versions of this engine I can’t say anything. But the performance I am getting seems perfectly reasonable. I have a 1060 card and the game I am making is very graphically intensive. On high settings I maintain 30fps. Anybody with newer hardware than me is well above that. And my game is barely alpha - there is tons of optimization still to be done.

I’d expect a real studio with more than one noob employee could do some work to make my game run at 60fps on average user hardware.

I’m only talking PC here. I don’t know anything about consoles. I’m just going off of what I’ve actually seen myself. Every software has its shortcomings, but I think the doom and gloom here is more a matter of craftsmen blaming tools. Certainly the engine is not broken. I’m not concerned with people doing arbitrary tests to show which version gets more fps than the other - I’m concerned with finishing games, and for me Unreal has been the most productive tool to that end so far. So that’s why I (who is a people) use it and like it.

And the big thing is that I try to stay well within my scope of competency. If I wanted a AAA-looking multiplayer shooter to run at 60fps on an Xbox 360, I’d first expect to have millions of dollars and a major team.

Look, if it works for you great.

That must mean you are one of the 10 people targeting 1080p instead of 4k.

Good luck making any sales “for PC” with that as the baseline.

Most PC gamers don’t even have gaming PCs. The majority audience is not going to have 4k monitors in the next five years.

Again, I have to ask: if you think Unreal is broken, what sort of games are you trying to make, exactly?

First off, I want to make sure you realize how silly this statement is:

BY DEFINITION, a PC gamer will have a gaming PC, regardless of what YOU think qualifies.

The majority audience means absolutely nothing.
It’s whatever your target demographic is.

For almost all of us here - trying to use Unreal of all things - the target demographic is surely not the penniless mobile gamer who can’t afford the latest 3080ti and a 4k screen.

Double down on that, since most of us who work on film actually have to get scenes to run at 8k.
In UE4 or 5, that’s not even possible.

Triple down on that, since the engine can’t currently even run an empty scene at 4k 60FPS - let alone the standard (which is above 120fps).

It varies.

Most of us make high end AAA quality content/games/marketplace plugins etc.

Some of us even work or used to work for AAA companies - where Unreal has been put out to pasture for the past year, since it has literally become unusable.

Others do cinematics, including but not limited to stuff like GOT and The Mandalorian.

This discussion is actually about HOW the engine has become completely unusable for professionals over the past couple of years - not “I’m an indie and I know nothing, so it works for me”.

Again. It’s great it works for you. Kudos.
Doesn’t change the fact that it pretty much doesn’t work for anyone else at the moment.

PS: Let’s at least try to throw less :poop: than monkeys?
There’s really no reason to make silly generalizations that mean absolutely nothing.

PPS:

“Other” (the dotted line) is at times already exceeding 1080p.
Naturally, different sources will reveal different things.

For games specifically you can refer to Steam’s numbers, which, sure, say that most are on 1080p:
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
However, if your idea is to leave ~20% of the market having to down-scale their main monitor to play your game, that likely means giving up ~20% of your returns.

"This discussion is actually about HOW the engine has become completely unusable for professionals over the past time span - not “i’m an Indy and I know nothing so it works for me”.

That’s not what the title suggests. The OP is asking why people like the engine and use it. Cut and dried.

A few people beating the same dead horse over and over isn’t productive. Instead of uselessly complaining so much, you should create solutions. If the engine really is broken, there’s opportunity for very clever people.

Yes, it’s called moving to a better engine, like CryEngine just to name one.

Everyone else attempting to render at 4k has done so, including big titles with tons of funding - some of which have made funding history (https://www.kickstarter.com/projects/1294225970/kingdom-come-deliverance).

I am not sure where you are getting this BS from. What FPS are you getting when you load an empty scene at 4k resolution?

At 4k resolution I am getting about 230 FPS empty scene, and 150 FPS with my game running full bore (RTS game with dozens of units on screen).
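
For what it’s worth, here is roughly how the test conditions can be pinned down so these numbers are comparable between posters. A minimal sketch, assuming a throwaway test actor - the class name is mine, but r.SetRes, t.MaxFPS, and stat unit are stock UE4 console commands:

```cpp
// FpsTestHelper.h - hypothetical test actor; drop one into the map
// being measured. Only the console commands below are stock UE4.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Engine/Engine.h"
#include "FpsTestHelper.generated.h"

UCLASS()
class AFpsTestHelper : public AActor
{
    GENERATED_BODY()

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Force 4K fullscreen so both sides measure the same output size.
        GEngine->Exec(GetWorld(), TEXT("r.SetRes 3840x2160f"));

        // Uncap the frame rate and disable smoothing so the number
        // reflects actual engine cost, not a limiter.
        GEngine->bSmoothFrameRate = false;
        GEngine->Exec(GetWorld(), TEXT("t.MaxFPS 0"));

        // Show ms per frame, not just FPS - claims like "~9ms slower"
        // are about frame time.
        GEngine->Exec(GetWorld(), TEXT("stat unit"));
    }
};
```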

That would depend on what GPU you are using.

Surely you won’t get anything at all on a 1060, like the one the other person says he’s using.

A 1080ti should well be capable. But the engine is not.
A 20xx should also be capable. But the engine is not.
And even a 3060/3070 struggles, to say the least, on empty scenes.

Are you serious? Who do you think is developing in 4k on a 1060 card? Or, for that matter, even attempting to PLAY anything in 4k on a 1060? I wouldn’t suggest playing anything in 4k even with a 1080ti - what newer games can you mention that comfortably run at 60+ fps in 4k resolution on a 1080ti?

Your comments are silly; the engine is plenty capable - there is nothing specific to Unreal that is slowing anything down. It’s all about how you are using it… You have full access to the source code, and you can basically pick and pull whatever parts you want or don’t want for your specific game (not that you really need to do that).

Let me ask you this, what do you claim specifically is the culprit slowing things down?

3060 and 3070 don’t struggle at all with 4k and populated scenes, let alone empty scenes. Are you just trolling here?

Pretty much anything, including RDR2.
A really good Unreal one is The Outer Worlds - it only dips below 60fps occasionally (it wouldn’t run that way if they’d used a modern engine version).

No one. The guy above was, however, attempting to say that the engine runs well on a 1060. Which it doesn’t.

It’s not. It used to be, before .26 came along.
Now, in order to develop something that actually works, you have to stick to older versions - .24 is probably also better than .25.

Yes there is: lack of quality control on Epic’s part, and a bad rendering pipeline introduced in .26 which is still present in .27.

There are only about 100 topics describing the 20% fps drop between versions. Go look them up.
If I knew what was directly causing it, I would have published a branch for merging back into the engine already.
So would the thousands of others affected by the changes, obviously.

A 3060 struggles. A 3070 is a little better - neither of them is really capable of 144hz.
That’s probably expected.

But what matters here is the performance differential between .25 and .26 on the same rig, at the same res, with the same settings, for the same scene.
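
If anyone wants to check that differential themselves rather than take anyone’s word for it, the cleanest evidence is the same probe logging averages in both versions. A minimal sketch, assuming a custom actor - the class name, warm-up, and window lengths are mine, not anything Epic ships:

```cpp
// FrameTimeProbe.h - hypothetical actor: place it in the same map in
// 4.25 and 4.26 and compare the logged averages.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "FrameTimeProbe.generated.h"

UCLASS()
class AFrameTimeProbe : public AActor
{
    GENERATED_BODY()

public:
    AFrameTimeProbe()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Skip the first seconds so shader compiles and streaming
        // don't pollute the sample.
        ElapsedTime += DeltaSeconds;
        if (ElapsedTime < WarmupSeconds)
        {
            return;
        }

        AccumulatedTime += DeltaSeconds;
        ++FrameCount;

        // Log one average per fixed window, then start over.
        if (AccumulatedTime >= SampleWindowSeconds)
        {
            const float AvgMs = (AccumulatedTime / FrameCount) * 1000.0f;
            UE_LOG(LogTemp, Log,
                TEXT("Avg frame time over %d frames: %.2f ms (%.1f FPS)"),
                FrameCount, AvgMs, FrameCount / AccumulatedTime);

            AccumulatedTime = 0.0f;
            FrameCount = 0;
        }
    }

private:
    float ElapsedTime = 0.0f;
    float AccumulatedTime = 0.0f;
    int32 FrameCount = 0;

    const float WarmupSeconds = 10.0f;      // illustrative values
    const float SampleWindowSeconds = 30.0f;
};
```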

The newer versions of the engine (now 2 years’ worth), as per the OP - suck.
And it doesn’t end there; there are also billions of broken/unfinished features that were dubbed “finished enough”.
Grooms, for instance, RVT, procedural foliage, etc.

Sorry, but RDR2 actually only gets around 30 FPS with a 1080ti. We are talking about maximum settings here, of course, right? Otherwise you could run ANY game in 4k at 144hz with a garbage card, so it only makes sense to talk about maximum settings.

The guy above with the 1060 actually said he only gets 30 fps and never mentioned 4k by the way.

I am using 4.26 and never noticed anything you are talking about. The FPS numbers I am getting and mentioned earlier (230 FPS empty scene and 150 FPS game running) at 4k resolution are with 4.26.

If that could have been 20% higher I wouldn’t even know, nor care.

I read a thread about the performance hit, and I noticed they said it was editor-only; the packaged game still ran at full speed. So honestly, after a certain point you should only be working with packaged builds anyway. (Play-in-editor is for the very early stage of prototyping, in my opinion - too many things work differently than standalone/packaged.)

When you say develop something that actually works, what do you mean? I have yet to discover anything NOT working in 4.26. I have made a full-fledged game from scratch and can’t imagine what else there could be that I am missing or is “broken”.

Really, I think the point about the difference between .25 and .26 is BS. The engine is basically open source - you don’t think some smart ■■■ would have figured out what’s causing a 20% drop in speed? And let’s say there actually is something - well, I imagine it must be there for some good reason; those cycles aren’t just evaporating, right?

If you hate 4.26 so much, then just don’t use it. You can use any version of the engine you want, so is there some specific feature added in 4.26 that you need so badly that you’ll accept a supposed 20% drop in frame rates?

I like how you are now mentioning 144hz when you claim new graphics cards struggle.

At 4k resolution you want to get 144 frames on a 3060? You will need to reduce some settings or have very little going on in the scene/game. This is not a problem with Unreal Engine; it’s a problem with your expectations.

I am not “attempting” to say the engine performs fine on a 1060. You can watch me developing and playtesting my game on Twitch - it’s running at 40-60fps while I’m streaming and playing in PIE.

And it’s not even very carefully optimized yet.

Very poor craftsmen blame their tools. Instead of running tests on empty scenes and complaining about stuff you can’t control, you could be making games, despite whatever “problems” the free engine is giving you. And if you fail to make games and you are blaming the engine… you probably need to get some help.

Pretty sure it does not, but then again it’s been a few years since I ran games on a 1080ti for anything other than tests.
Back from ARK dev days, actually.

He also claims the engine “is fine”. If you want to re-type stuff pointlessly, you may as well do it right.

Unless the tools are to blame.
In your case, they certainly are. Kudos for “working around the problem”, However you are just ignoring the fact that 40-60FPS in PIE is literally NOTHING - by default pie is maybe 800x600 of a view port. So already playing in full screen will drop your FPS below what you’d like.
Do a test in a full screen published build.
At 1080p you shouldn’t be completely unable to get 60fps. Yet, you could have 20% more FPS by just using an older engine version than .26.
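
For that packaged test, something like this early in startup keeps the comparison honest. A minimal sketch - UGameUserSettings is stock engine API, but the wrapper function and the 1080p numbers are mine for illustration:

```cpp
// Call from e.g. a GameMode's BeginPlay in the packaged build, so the
// numbers aren't secretly coming from a small PIE viewport.
#include "CoreMinimal.h"
#include "GameFramework/GameUserSettings.h"

void ApplyBenchmarkDisplaySettings() // hypothetical helper
{
    UGameUserSettings* Settings = UGameUserSettings::GetGameUserSettings();

    // Pin a real fullscreen resolution before measuring.
    Settings->SetScreenResolution(FIntPoint(1920, 1080));
    Settings->SetFullscreenMode(EWindowMode::Fullscreen);
    Settings->SetVSyncEnabled(false); // vsync would cap the measurement

    // false = don't re-check command-line overrides on apply.
    Settings->ApplySettings(false);
}
```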

That’s your problem - as a newcomer to the engine, you don’t know that it was once actually a lot better.
Do some research on the forums. There are only about a billion topics complaining about specific breakage between releases that has never been patched.
Meanwhile, everyone with a project who upgraded engine versions for a specific need got screwed out of 20% fps, among other things.

It’s not. It’s packaged builds too.
And when targeting older GPUs, a 20% drop in performance means literally not being able to publish.
It also appears to have extended to UE5.
There’s currently a note in the rendering thread addressing an issue with DX11 drivers - that’s not it either; the issue persists in DX12.
That note is literally the FIRST acknowledgment of the problems by Epic in 2 years.
(Link to it, since the rendering forum is being flooded with “KB notes”, whatever those are: Tech Note - [Bug] 4.26 D3D11 - Shader creation on multiple threads causes hitches)

It’s 100% not open source.
Can you access the source? Yes.
Don’t mix the two things up. It’s an insult to open-source efforts.

Personally, I opted to move to an engine where I actually get support.
So has pretty much everyone else.
This issue has been IGNORED by Epic for 2 years now.
Why would I or anyone else waste time fixing the mess that the Epic team refuses to take care of?
Refer freely to any of the thousands of topics like this:
https://forums.unrealengine.com/t/looking-for-input-on-learner-progression-for-our-future-ue-community-learning-experience/262563/32

Meanwhile, Unity, for instance, has developers actually answering bug reports - in public, providing viable solutions - despite being a lesser company that’s not making billions off of garbage visual-only pixels in Fortnite.

Point of fact, I really don’t, for professional projects. At this point I’m also convinced that no one really does. We all stick to other engines, or .18/.20 if we really have to use Unreal.
There are some people out there developing in “UE5” - I think 90% of that and the related gameplay footage is the BS equivalent of a pre-rendered GDC preview vs. what you actually get on release day.

There’s nothing going for UE5 yet that would justify wasting the amount of time needed to fix up the engine nightly just to push development onto it.
And according to forum posts on its performance, it’s only getting worse the more features they add - which, given it’s not an official final release, is also to be expected. A different scale has to be used to measure the two.

UE4.27 is “FINAL”; as such, it shouldn’t have any ISSUE as bad as a 20% FPS drop between an older version and itself (same for .26).
UE5 isn’t “final”, so any metric of any kind is purely speculative, and you can indeed choose to “live dangerously” and develop something you’ll never be able to release.

Not for me, no. I have my own .24 version with custom ray-marched clouds, a custom sky-sphere, and an updated PhysX version going.
That doesn’t change the fact that there is absolutely NOTHING in .26 or .27 that would justify a 20% frame drop when disabled - OR is there?
I mean, are you suggesting that it’s OK for an engine version to just magically cost you 20% more rendering time out of nowhere? Without any support or acknowledgment from official sources? And that an engine DOING this doesn’t just “SUCK”?

It’s not a problem with my expectations when prior engine versions are capable of this while the current one is not.
It’s you who has the wrong expectations, thinking the scene has to be dialed down to run “right”. It’s the rendering pipeline that is causing the drag, NOT your scene
(when the same scene runs above the required stats in older versions, obviously).


“by default PIE is maybe an 800x600 viewport”

I maximized the viewport, and it gets better performance in a build. Somehow I trust my own eyes more than the one guy on the forum who complains in every thread and seems to take offense at other people not having problems.

I wonder if, in the previous versions of the engine that worked better, you were just using less intensive resources, because that was 5-10 years ago. Who knows, but there certainly seems to be plenty that you miss.

What part of “same scene different engine” has been unclear so far?

Somehow I trust my own experience as a paid professional in the field more than that of someone making wild claims without anything to back them up. On a 1060, no less…