Maybe / maybe not. Gotta ask, what’s that assessment based on / how much of it is assumptions? Historically Epic / Tim Sweeney have been good about non-invasive tracking. But not sure how long that will last, especially if Tim retires (same goes for Tim@Apple btw). There’s a whole industry devoted to tracking that’s out of control. You only have to look at this, and ask how long til that nightmare comes West? A lot has changed since this wider discussion (remember any of it?). We all live in an episode of Black Mirror now.
That may be your assessment. Having operated web sites, including heavily ad oriented ones, for the better part of the last 15 years, and seen this “tracking” from the inside, I find it to be much more pedestrian and much less sinister.
If all of the West suddenly votes populist-totalitarian and empowers the secret police to come for your web records when you search for the wrong kind of literature, then I shall have been proven wrong, but … let’s say, I’d be willing to take pretty long odds on that actually happening.
Anyway, my assessment of the EPIC website is by comparing their list of beacons in my tracker-tracker to what a reasonable web developer would use. And, generally, I prefer to let web developers get the information they need to operate the site better.
However, I much dislike loud advertising that’s trying to tell me that I’m inferior and only buying a new shiny product will make my soul whole. That ■■■■’s corrosive! And also, a real threat to our well-being as humans! But also, not something you find on the Unreal developer sites.
What you mean is that we should avoid starting our projects with UE5 early version 2? That we won’t be able to convert those projects to the UE5 final version?
We should start our UE5 projects on UE4.26. Right now we won’t be able to use Nanite, Lumen, MegaAssemblies, or any new feature… but when UE5 is released we will be able to upgrade only from UE4.26-27 and use all UE5 features. If we use UE5 right now, since it is an early version, we won’t be able to convert without bugs.
I haven’t said that. UE5.0ea projects are going to be convertible to UE5, this is the official info.
You did not read the documentation.
This makes no sense. It should literally be the other way around.
Just to summarize, there are 2 paths:
- UE4.26 → UE5EA → UE5
- UE4.26 → UE4.27 → UE5
The only thing that you can’t do is upgrade from UE4.27 to UE5EA; those are not going to be compatible at all.
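The two paths above can be sketched as a tiny compatibility graph; here’s a toy Python illustration (the version labels are just the thread’s shorthand, and `can_reach` is a hypothetical helper, not anything from Epic):

```python
# The two supported upgrade paths from the post, as a simple adjacency map.
upgrades = {
    "UE4.26": ["UE5EA", "UE4.27"],
    "UE4.27": ["UE5"],   # but NOT UE5EA
    "UE5EA": ["UE5"],
}

def can_reach(src, dst):
    """Depth-first check: can you upgrade from src to dst through any path?"""
    stack, seen = [src], set()
    while stack:
        v = stack.pop()
        if v == dst:
            return True
        if v in seen:
            continue
        seen.add(v)
        stack.extend(upgrades.get(v, []))
    return False

print(can_reach("UE4.27", "UE5EA"))  # → False (the one forbidden jump)
print(can_reach("UE4.26", "UE5"))    # → True  (via either path)
```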
Isn’t there 4.28 coming too?
AFAIK there is no mention of 4.28 anywhere.
What I said is taken from their own documentation: Welcome To Unreal Engine 5 Early Access | Unreal Engine Documentation
Lots of hints about 4.28.
I would assume then that it has the same rule as 4.27: it will not be compatible with the Early Access version, but you can safely assume that you will be able to update to the UE5 final release.
So, we should not use UE4.27 and up; we should stay with UE4.26.2, at least if we want to upgrade our projects to UE5 when it’s released in 2022 and avoid compatibility issues. That’s basically it?
I do think UE4.28 is coming, and even UE4.29 if UE5 is delayed again. I don’t think anybody was expecting a UE4.27 version last year.
Actually I wish I could start with UE5EA right now… but no marketplace product, asset, or plugin has been updated to UE5. I have asked several developers, and they all tell me that they will only update their assets when UE5 is officially released, not for EA, and at this point I can only guess April 2022. I still think this is not good news. What would happen if we start a UE4.26 project with Marketplace products and, since no developer is updating anything at the moment, we can’t upgrade to UE5 in 2022 because of a plugin issue, materials not working, the grooming system being reworked, or because you painted a landscape for months and the new UE5 system is completely different? People are already having issues when trying to convert UE4 projects to UE5EA, even more if they use Marketplace assets; just search the forums.
Polycount has never been an issue with LODs, and even if UE5 can hold 1 TB of assets, you are never going to release a game with more than 80 GB… unless you are really nuts. So the Nanite tech is great, but not the next revolutionary thing. Most computers don’t even have more than 1 TB of free disk space, the Oculus 2 is 50 GB, and you are releasing a 3 TB game filled with mega assemblies and non-optimized assets? I don’t think LODs are ever going to be replaced. Nanite is a great feature and I love it for detailed assets and cinematics… I think this feature has more of a future in the film industry, not games. Anyway, foliage is far more important than rocks.
I think a great destruction and dynamics system would be the next big thing for games, honestly… and even more for the film industry: real water simulations, destruction without FPS drops. But nothing is guaranteed right now. I just wish Chaos had been better; FPS is terrible right now. I still believe, and hope, that the final version won’t drop any previously announced features. For now, the one thing I feel is fantastic about UE5 is Lumen: I think Lumen is beautiful, dynamic, and hyper-realistic.
You can safely use UE4.27 and then update to UE5 next year, you just can’t update to Early Access version if you choose to start with UE4.27.
Nanite isn’t really that much more expensive (memory-wise) than standard quality assets. LODs and the current polycount workflow are not OK; they’re actually terrible. Nanite is the bigger revolution for a lot of reasons and should also support destruction. Also consider that this is just the start; I think we will see a lot of changes in how we think about geometry. On the other hand, I am much more skeptical about Lumen. Don’t get me wrong, ray tracing is the future; it’s just the Lumen implementation that doesn’t fully convince me. I’m not sure it’s the future of RT tech.
I’m not talking about Nanite for Nanite’s sake… but the fact of importing huge assets, like massive useless statues with millions of polygons, into UE makes project sizes bigger, and people are focused now on “wow! Nanite allows me to import huge assets with millions of polygons, cool”… but honestly, you shouldn’t import large assets just because you can.
Your projects should actually be created with the consumer in mind, and consumer-wise 100 GB games are already a bad idea, but making 1 TB games is the worst idea. Just the UE5 default project is 100 GB… and it is not even a game, just a level. Imagine making that into a full game with 50 to 100 levels; it would be impossible for an average client to even install it. So Nanite is great, but not really a revolution. Lumen is, and it is fantastic: static lights have always been a problem in the past, and now Lumen is beautiful.
Dynamics, destruction and simulation, that’s the revolution games need, right now not even close.
You are assuming a lot here. It’s a new tool and people are messing around with it; that doesn’t look like a problem to me. If you don’t need assets with millions of polygons, just don’t make them. Nanite has better culling, better overdraw, better material management, and can handle way more instances than was previously possible, so it’s not just about the number of polygons. By the way, Lumen is only possible because of Nanite; take that away and Lumen goes with it. Did you watch any of the videos Epic made on the new features? Because it doesn’t look like it.
On the matter of size, yes, you have more polygons, but you don’t need other stuff, like 4K normal maps, that also consumes memory. The compression system of Nanite already does a good job and is bound to improve; at this time, having a million polygons is not that expensive memory-wise. Also consider that the project size is bigger than the release for the customer; one reason is that you need not only the Nanite version but also the original raw data, which is even bigger. For the Valley of the Ancient, Epic also said that most of the space is occupied by textures, and it is a demo, unoptimized in many ways; just check the video for the explanation. The final released package is 20 GB, not 100. Your 1 TB size for a normal game is completely made up.
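To put a number on the normal-map point, a quick back-of-envelope (these are standard GPU texture format sizes: RGBA8 is 4 bytes per texel, BC5 block compression is 1 byte per texel; mipmaps would add roughly a third on top):

```python
# A single 4K normal map is not free either.
w = h = 4096
uncompressed_rgba8 = w * h * 4  # 4 bytes/texel, uncompressed
bc5 = w * h * 1                 # BC5: 16 bytes per 4x4 block = 1 byte/texel

print(uncompressed_rgba8 // 2**20, "MB")  # → 64 MB
print(bc5 // 2**20, "MB")                 # → 16 MB
```

So trading some texture detail for real geometry is not as lopsided, memory-wise, as it might sound.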
When I said I wasn’t fully sold on the Lumen feature, I meant on the technical side. The outcome is clearly something we all want and, yes, it is a revolution, but the way they achieved it, I don’t know; it seems to have a lot of exceptions and limitations and goes in a different direction than the HW manufacturers. I’m just a little bit more cautious to believe that this solution is generalized enough to scale into the future. I’m waiting to see what happens when they get decent HW acceleration support, and I want to see how they handle high-quality reflections.
Because nobody understands what Nanite actually does or why it’s important, here’s a quick summary of it using snippets from the Nanite Documentation:
In most cases Nanite scales extremely well with screen resolution. It does so based on two key techniques: fine-grained level of detail and occlusion culling [which people completely ignore]. Typically this means, regardless of the geometric complexity of the source data in the scene, the number of triangles Nanite attempts to actually draw to the screen is fairly constant and proportional to the number of pixels. Nanite follows the design principle that there is no use in drawing far more triangles than there are pixels.
Put simply, Nanite tries to draw only as many triangles as there are pixels. In addition to that:
Level of Detail (LOD) is automatically handled and no longer requires manual setup for individual mesh’s LODs
Data is streamed in on demand so that only visible detail needs to reside in memory.
Because of that (this is why it’s important):
Frame budgets are no longer constrained by polycounts, draw calls, and mesh memory usage
Put simply, game performance is no longer tied to the complexity of the scene.
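A toy calculation of that pixel-proportional idea (purely illustrative: the one-triangle-per-pixel constant and the simple clamp are my assumptions, not Nanite internals):

```python
# Toy model: drawn triangles track the pixel count, not the source polycount.
def drawn_triangles(screen_w, screen_h, source_triangles, tris_per_pixel=1.0):
    """Rough pixel-proportional budget; the source polycount only caps it."""
    budget = int(screen_w * screen_h * tris_per_pixel)
    return min(budget, source_triangles)

# A 100M-triangle statue and a 2M-triangle rock cost roughly the same at 1080p:
print(drawn_triangles(1920, 1080, 100_000_000))  # → 2073600
print(drawn_triangles(1920, 1080, 2_000_000))    # → 2000000
```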
It’s important to understand that polycounts don’t apply only to individual models, but also to the whole scene itself, and a scene comprised of only game-ready models can still reach millions of triangles easily.
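As a quick sanity check on that last claim, with made-up but plausible numbers (neither figure comes from the thread):

```python
# Even "game-ready" assets add up fast at the scene level.
props = 500               # instances visible in a level (assumed)
tris_per_prop = 20_000    # a typical optimized prop budget (assumed)
scene_triangles = props * tris_per_prop
print(scene_triangles)    # → 10000000, i.e. 10M triangles of low-poly content
```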
The presentations for both Nanite & Lumen are available here and go through how each system works in detail.
If you are a serious developer, you will keep your polycount low… if not, all your projects will be too large. Fortnite is already around 50 GB… imagine the same game without proper optimization. I’m not installing a 500 GB game, ever. But let’s say my PC can handle files so large; regular people with a 1 TB laptop won’t. How successful is your project going to be?
The final goal for Nanite regarding bandwidth and the scale of the targeted platforms goes from phones/tablets to film production.
And there are already games with more than 150 GB out there.
There are constraints like total install size, but also the required latency and bandwidth of the storage.
Yes, I think there is almost no sense in 8K textures if the target platforms mainly consist of FullHD users, but for UHD/4K players the game could look underwhelming if you top out at too low a resolution.
So every scope of targets has useful maxima and minima.
If this happens anytime soon it may actually be a huge boost for things like Cloud gaming.
Will gamers get tired of fighting download speeds / stupid hardware prices / crypto wars?
Did you read our previous reply to this? Because it doesn’t seem so. It seems you just decided that you have to stay low-poly no matter what. As already said by multiple people, Nanite is good even for low-poly.