The biggest blocker of them all - Memory

The biggest blocker in this ecosystem's entire history, going back to 2019, is memory. Each era of content was defined by how much memory we had and what tools we were given to optimize it. This still applies: UEFN has unlocked a multitude of new tools and allows scope to increase, and memory also recently received a patch that made actors and devices cost less.

That is all great, but two updates later the bugs started again: massive memory increases when world streaming was used. And UEFN doesn't have the optimization tools we had in Creative to identify what causes the peaks or to debug them. You also can't properly see memory in UEFN right now, because the bar only shows the memory cluster you are currently standing in, not the highest one, so you can't really plan your features and scope.

Devices consume overall memory because of their fancy meshes, overriding world streaming because they include functionality. This messes with the scope of games: any attempt to make a complex game quickly hits the memory limit due to the combination of actor count and the devices needed to support a complex world. Verse partly solves that, but it isn't yet capable of handling multiple device behaviors on its own. We are also left with a memory calculation tool that takes 20 minutes to complete on complex maps, whereas in Creative it was instant, which is a huge waste of dev time.

We need all of Creative's optimization tools, and devices should not count toward memory (ditch their meshes if you have to; we need the functionality to work, not their looks). Honestly, memory needs to stop being the blocker. Even in Creative, it killed one of my largest projects, and all it took was one update that changed how memory was calculated (it could have been a bug, who knows). It is a huge risk for our development to have thousands of assets decide whether you can publish or not, because if even one calculation out of those thousands breaks, you are locked out of continuing to work or publishing. That should be the decision of a benchmarking tool, not an arbitrary number.

@Wertandrew Thank you for your feedback. While I cannot guarantee a response, I can confirm that this has been forwarded to the appropriate team.

I acknowledge that working across all devices is a big part of Fortnite’s ‘magic’.

Hypothetically, if you could raise the memory cap but only publish for PC/consoles, would you?

Well, it depends on my target audience. We don't know which audiences play on which platforms at the moment, and personally, I love making multi-platform games. I would rather exclude one platform than target only one, so if dropping Switch means another 200k, I would be tempted, as long as my audience niche isn't on that platform.


I remember when, in Creative 2.0 a few updates ago, things started to cost less memory, which took my map from 109,000 down to about 60,000, so I added more to it.

Later on, the memory usage jumped up again, but I could return to the lobby, go back into my map, and it would read correctly again.

As of maybe three updates ago, memory usage is now over 115k, and nothing I do short of deleting things will bring it down.