I haven’t said that. UE5.0ea projects are going to be convertible to UE5, this is the official info.
You did not read the documentation.
This makes no sense. It should literally be the other way around.
Just to summarize there are 2 paths:
- UE4.26 → UE5EA → UE5
- UE4.26 → UE4.27 → UE5
The only thing you can’t do is upgrade from UE4.27 to UE5EA; those are not going to be compatible at all.
Isn’t there a 4.28 coming too?
AFAIK there is no mention about 4.28 anywhere.
What I said is taken from their own documentation: Welcome To Unreal Engine 5 Early Access | Unreal Engine Documentation
Lots of hints about 4.28:
Why assume, then, that it follows the same rule as 4.27? It will not be compatible with the Early Access version, but you can safely assume you will be able to update to the UE5 final release.
So we should not use UE4.27 and up; we should stay with UE4.26.2, at least if we want to upgrade our projects to UE5 when it’s released in 2022 and avoid compatibility issues. That’s basically it?
I do think UE4.28 is coming, and even UE4.29 if UE5 is delayed again, I don’t think anybody was expecting a UE4.27 version last year.
Actually, I wish I could start with UE5EA right now… but no Marketplace product, asset, or plugin has been updated to UE5. I have asked several developers and they all tell me they will only update their assets when UE5 is officially released, not for EA, and at this point I can only guess April 2022. I still think this is not good news. What would happen if we start a UE4.26 project with Marketplace products and, since no developer is updating anything at the moment, we can’t upgrade to UE5 (2022) because of a plugin issue, materials not working, the Grooming system being reworked, or a landscape you painted for months not carrying over because the new UE5 system is completely different? People are already having issues trying to convert UE4 projects to UE5EA, even more so if they use Marketplace assets; just search the forums.
Polycount has never been an issue with LODs, and even if UE5 can hold 1 TB of assets, you are never going to release a game larger than 80 GB… unless you are really nuts. So the Nanite tech is great, but not the next revolutionary thing: most computers don’t even have 1 TB of free disk space, the Oculus 2 is 50 GB, and here you are releasing a 3 TB game filled with mega-assemblies and unoptimized assets. I don’t think LODs are ever going to be replaced. Nanite is a great feature and I love it for detailed assets and cinematics… I think it has more of a future in the film industry than in games. Anyway, foliage is far more important than rocks.
I think a great destruction and dynamics system would be the next big thing for games, honestly… and even more so for the film industry: real water simulations, destruction without FPS drops. But nothing is guaranteed right now. I just wish Chaos had been better; FPS is terrible right now. Still, I believe and hope that the final version ships the previously announced features, and I know it will. For now, the one thing I feel is fantastic about UE5 is Lumen; I think Lumen is beautiful, dynamic and hyper-realistic.
You can safely use UE4.27 and then update to UE5 next year; you just can’t update to the Early Access version if you choose to start with UE4.27.
Nanite isn’t really that much more expensive (memory-wise) than standard-quality assets. LODs and current polycounts are not OK; they’re actually terrible. Nanite is the bigger revolution for a lot of reasons and should also support destruction. Also consider that this is just the start; I think we will see a lot of changes in how we think about geometry. On the other hand, I am much more skeptical about Lumen. Don’t get me wrong, ray tracing is the future; it’s just the Lumen implementation that doesn’t fully convince me, and I’m not sure it’s the future of RT tech.
I’m not talking about Nanite for Nanite’s sake… but the fact is that importing huge assets into UE, like massive useless statues with millions of polygons, makes project sizes bigger, and people are focused now on “wow! Nanite lets me import huge assets with millions of polygons, cool”… but honestly, you shouldn’t import large assets just because you can.
Your projects should actually be built with the consumer in mind, and consumer-wise, 100 GB games are already a bad idea; making 1 TB games is the worst idea. Just the UE5 default project is 100 GB… and it is not even a game, just a level. Imagine making that into a full game with 50 to 100 levels; it would be impossible for an average client to even install it. So Nanite is great, but not really a revolution; Lumen is, and it is fantastic. Static lights have always been a problem in the past; now Lumen is beautiful.
Dynamics, destruction and simulation: that’s the revolution games need, and right now we’re not even close.
You are assuming a lot here; it’s a new tool and people are messing around with it, which doesn’t look like a problem to me. If you don’t need assets with millions of polygons, just don’t use them. Nanite has better culling, better overdraw, better material management, and can handle way more instances than was previously possible, so it’s not just about the number of polygons. By the way, Lumen is only possible because of Nanite; take that away and Lumen goes with it. Did you watch any of the videos Epic made on the new features? Because it doesn’t look like it.
On the matter of size, yes, you have more polygons, but you don’t need other stuff, like 4K normal maps, which also consume memory. The compression system of Nanite already does a good job and is bound to improve; at this point, having a million polygons is not that expensive memory-wise. Also consider that the project size is bigger than the release for the customer; one reason is that you not only need the Nanite version but also the original raw data, which is even bigger. For the Valley of the Ancient, Epic also said that most of the space is occupied by textures, and it is a demo, unoptimized in many ways; just check the video for the explanation. The final released package is 20 GB, not 100. Your 1 TB size for a normal game is completely made up.
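To put the normal-map comparison in numbers, here is a quick back-of-the-envelope sketch in Python. The 4:1 block-compression ratio and the roughly one-third mip-chain overhead are standard texture math, not Nanite-specific figures, so treat this as an illustration of scale rather than engine-measured data:

```python
# Memory cost of a single 4K normal map (plain texture arithmetic).
WIDTH = HEIGHT = 4096
BYTES_PER_PIXEL = 4  # e.g. 8-bit RGBA before compression

# Uncompressed base mip: 4096 * 4096 * 4 bytes = 64 MiB.
uncompressed = WIDTH * HEIGHT * BYTES_PER_PIXEL
uncompressed_mib = uncompressed / (1024 ** 2)
print(f"uncompressed base mip: {uncompressed_mib:.0f} MiB")

# Block compression (BC5/BC7-class formats use 16 bytes per 4x4 block,
# i.e. ~1 byte per pixel), and a full mip chain adds about one third.
compressed = WIDTH * HEIGHT * 1
with_mips = compressed * 4 // 3
print(f"compressed + mips: ~{with_mips / (1024 ** 2):.0f} MiB")
```

Even compressed, a handful of 4K maps per asset adds up quickly, which is the point being made about where the memory actually goes.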
When I said I wasn’t fully sold on the Lumen feature, I meant on the technical side. The outcome is clearly something we all want and yes, it is a revolution, but the way they achieved it, I don’t know; it seems to have a lot of exceptions and limitations and goes in a different direction than the HW manufacturers. I’m just a little more cautious about believing this solution is generalized enough to scale into the future. I’m waiting to see what happens when they get decent HW acceleration support, and I wanna see how they handle high-quality reflections.
Because nobody understands what Nanite actually does or why it’s important, here’s a quick summary of it using snippets from the Nanite Documentation:
In most cases Nanite scales extremely well with screen resolution. It does so based on two key techniques: fine-grained level of detail and occlusion culling [which people completely ignore]. Typically this means, regardless of the geometric complexity of the source data in the scene, the number of triangles Nanite attempts to actually draw to the screen is fairly constant and proportional to the number of pixels. Nanite follows the design principle that there is no use in drawing far more triangles than there are pixels.
Put simply, Nanite tries to draw only as many triangles as there are pixels. In addition to that:
Level of Detail (LOD) is automatically handled and no longer requires manual setup for individual meshes’ LODs
Data is streamed in on demand so that only visible detail needs to reside in memory.
Because of that (this is why it’s important):
Frame budgets are no longer constrained by polycounts, draw calls, and mesh memory usage
Put simply, game performance is no longer tied to the complexity of the scene.
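The “triangles proportional to pixels” idea from the documentation quote above can be made concrete with a rough Python sketch. The one-triangle-per-pixel target below is an illustrative assumption based on that quote, not a number measured from the engine:

```python
# Rough illustration of Nanite's pixel-proportional triangle budget.
# triangles_per_pixel = 1.0 is an assumed illustrative target, taken
# from the docs' "no more triangles than pixels" design principle.

def nanite_drawn_triangles(width: int, height: int,
                           triangles_per_pixel: float = 1.0) -> int:
    """Triangles Nanite would aim to draw, regardless of source polycount."""
    return int(width * height * triangles_per_pixel)

# Source scene complexity does not change the draw budget:
for source_tris in (100_000, 10_000_000, 1_000_000_000):
    budget = nanite_drawn_triangles(1920, 1080)
    print(f"source={source_tris:>13,} -> drawn ~{budget:,}")
# At 1080p the budget works out to about 2 million triangles every time.
```

This is why a scene full of film-quality source meshes and a scene full of game-ready meshes end up with a similar number of drawn triangles: the budget follows the screen, not the content.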
It’s important to understand that polycounts don’t apply only to individual models, but also to the whole scene itself, and a scene comprised of only game-ready models can still reach millions of triangles easily.
The presentations for both Nanite & Lumen are available here and go through how each system works in detail.
If you are a serious developer you will keep your polycount low… if not, all your projects will be too large. Fortnite is already around 50 GB… imagine the same game without proper optimization. I’m not installing a 500 GB game… never. But let’s say my PC can handle files so large; regular people with a 1 TB laptop won’t. How successful is your project going to be?
The final goal for Nanite regarding bandwidth and the scale of the targeted platforms goes from phones/tablets all the way to film production.
And there are already games of more than 150 GB out there.
There are things like total install size, but also the needed latency and bandwidth of the storage.
Yes, I think there is almost no sense in 8K textures if the target platforms mainly consist of Full HD users, but for UHD/4K players the game could look underwhelming if you top out at too low a resolution.
So every scope of targets has useful maxima and minima.
If this happens anytime soon, it may actually be a huge boost for things like cloud gaming.
Will gamers get tired of fighting download speeds / stupid hardware prices / crypto wars?
Did you read our previous reply to this? Because it doesn’t seem so. It seems you just decided that you have to stay on low poly no matter what. As multiple people have already said, Nanite is good even for low poly.
Yes, I get it that Nanite is good for low poly… obviously. I was not talking about visuals and performance; I was talking about project size and build size (packaging).
I still remember when I downloaded FIFA and it took like 4 days, and my Internet connection was actually great, but that’s not even the main problem… LOL. The problem is that you are not only targeting people with supercomputers when you develop a game; most people play on PCs with basic gaming specs. Not everybody has access to an RTX card (especially now that prices are through the roof… crazy). Basic computers might include 1 TB or less, and with laptops in most cases you can’t increase your drive space as easily; others can’t, or won’t do it just to play a game LOL. Just imagine how many people won’t buy your game because they don’t have the space to install it. It actually happened to me a long time ago; I used external drives and they do not always work, sometimes it’s hard to update, etc. EVEN UE4 uses a cache folder on C: (DerivedDataCache) that is not obvious to relocate to E: or D: (where you normally have more drive space), and that C: cache folder might be 150 GB or more just from working with UE projects; if you delete the cache, you have to compile every project again, which might take 1 to 3 hours per project.
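One note on the DDC specifically: per Epic’s Derived Data Cache documentation, the local cache location can be redirected with an environment variable rather than being stuck on C:. A minimal sketch, run from a Windows PowerShell prompt (the `D:\UE_DDC` path is just an example, not a required location):

```shell
# Redirect UE's local Derived Data Cache to a drive with more space.
# D:\UE_DDC is an example path; any folder on the larger drive works.
setx UE-LocalDataCachePath "D:\UE_DDC"
```

After setting this and restarting the editor, new derived data is written to the redirected folder instead of the default under C:.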
Developer-wise, working with large individual assets with no optimization is not a good idea right now. AAA companies are still working as optimized as before and keep requesting LODs or other types of optimization, even some that are already working with UE5, and I speak from experience, not guessing.
Anyway, I wanted people to post feature requests for UE5, and nobody is sharing their wishlists, so please do!
- 4.18-like engine STABILITY… so indies can be 100% productive!
- Large-Worlds support / Doubles. In progress atm, but no ETA!
- Scripting with fully-integrated environment not MS/VisStudio.
- Blueprints in TEXT form + interop / interchange with Scripting.
- Planets / Spherical-Terrain / Runtime-Terrain but 100% built-in.
Nanite already has a good compression system where even millions of polygons do not take much space compared to 4K normal maps, for example, and Epic already said it will improve. The problem is real for developers at the moment, where you need to keep different versions of the same asset. I also expect a big evolution around geometry, similar to what happened to textures over the years; people will find smart ways to add detail. It’s just too soon to judge right now, but it looks promising to me.