On their official FAQ they mention that the planned release date is early 2022: Frequently Asked Questions - Unreal Engine
Yes of course, I would never spread lies… I'm not that type of person. I love Epic and UE; I'm only concerned about the final release and project delays. Here is the real source of this information, from May 26, 2021, based on an interview with Chance Ivey and Galen Davis of Epic Games:
Thanks for posting. Still can't view it though. Something changed in the last year (since the switch-over to Discourse?). Now most Epic web pages, including the Marketplace, don't load properly. The reason, it would appear, is that they're now saturated in new trackers. If specific cookies / domains are blocked, core content on pages stays hidden. Seems like it's NOT tied to browser / device. Unfortunate PITA, but still not letting all trackers through.
I just hope it's optimized.
For what it's worth, the trackers on the Unreal sites all seem to be there to figure out how you use the site, how the site performs for you, and general demographic information such as "where are you using this," all of which are legitimate needs for the engine developers and for those running the Unreal websites. It's not like they're laden down with pre-roll video ads or malware cursor-pack downloads…
If you don’t want them to have this information (which absolutely helps them do a better job supporting us all) then feel free to cut those off, but then you also give up the expectation that the site will actually, you know, work well for you… That trade-off is of course different for every person.
Main point: Just because it's called a "tracker" doesn't necessarily mean that it's going to be used to your detriment. In fact, people who run websites need to know who uses the website, and how it works for them, to be able to run those sites better.
Maybe / maybe not. Gotta ask, what's that assessment based on / how much of it is assumptions? Historically Epic / Tim Sweeney have been good about non-invasive tracking. But not sure how long that will last, especially if Tim retires (same goes for Tim@Apple btw). There's a whole industry devoted to tracking that's out of control. You only have to look at this, and ask how long til that nightmare comes West? A lot has changed since this wider discussion (remember any of it?). We all live in an episode of Black Mirror now.
That may be your assessment. Having operated web sites, including heavily ad oriented ones, for the better part of the last 15 years, and seen this “tracking” from the inside, I find it to be much more pedestrian and much less sinister.
If all of the West suddenly votes populist-totalitarian and empowers the secret police to come for your web records when you search for the wrong kind of literature, then I shall have been proven wrong, but… let's say I'd be willing to take pretty long odds on that actually happening.
Anyway, my assessment of the Epic website comes from comparing its list of beacons in my tracker-tracker to what a reasonable web developer would use. And, generally, I prefer to let web developers get the information they need to operate the site better.
However, I really dislike loud advertising that's trying to tell me that I'm inferior and that only buying a new shiny product will make my soul whole. That ■■■■'s corrosive! And also a real threat to our well-being as humans! But also not something you find on the Unreal developer sites.
So what you mean is that we should avoid starting our projects with UE5 Early Access 2? We won't be able to convert those projects to the UE5 final version?
We should start our UE5 projects on UE4.26. Right now we won't be able to use Nanite, Lumen, MegaAssemblies, or any other new feature… but when UE5 is released we will be able to upgrade only from UE4.26–27 and use all the UE5 features. If we use UE5 right now, since it is an early version, we won't be able to convert without bugs.
I haven't said that. UE5 EA projects are going to be convertible to the final UE5; this is the official info.
You did not read the documentation.
This makes no sense. It should literally be the other way around.
Just to summarize, there are 2 paths:
- UE4.26 → UE5EA → UE5
- UE4.26 → UE4.27 → UE5
The only thing that you can't do is upgrade from UE4.27 to UE5EA; those are not going to be compatible at all.
Isn’t there 4.28 coming too?
AFAIK there is no mention of 4.28 anywhere.
What I said is taken from their own documentation: Welcome To Unreal Engine 5 Early Access | Unreal Engine Documentation
Lots of hints about 4.28:
Why would you assume then that it follows the same rule as 4.27? It will not be compatible with the Early Access version, but you can safely assume that you will be able to update to the UE5 final release.
So, we should not use UE4.27 and up; we should stay with UE4.26.2, at least if we want to upgrade our projects to UE5 when it's released in 2022 and avoid compatibility issues. That's basically it?
I do think UE4.28 is coming, and even UE4.29 if UE5 is delayed again; I don't think anybody was expecting a UE4.27 version last year.
Actually, I wish I could start with UE5EA right now… but no Marketplace product, asset, or plugin has been updated to UE5. I have asked several developers and they all tell me that they will only update their assets when UE5 is officially released, not for EA, and at this point I can only guess April 2022. I still think this is not good news. What would happen if we start a UE4.26 project with Marketplace products and, since no developer is updating anything at the moment, we can't upgrade to UE5 (2022) because of a plugin issue, materials not working, the grooming system being reworked, or because you painted a landscape for months and the new UE5 system is completely different? People are already having issues when trying to convert UE4 projects to UE5EA, even more so if they use Marketplace assets; just search the forums.
Polycount has never been an issue with LODs, and even if UE5 can hold 1 TB of assets, you are never going to release a game larger than 80 GB… unless you are really nuts. So the Nanite tech is great, but not the next revolutionary thing. Most computers don't even have more than 1 TB of free disk space, the Oculus Quest 2 has 50 GB, and here you'd be releasing a 3 TB game filled with mega assemblies and unoptimized assets. I don't think LODs are ever going to be replaced. Nanite is a great feature and I love it for detailed assets and cinematics… I think this feature has more of a future in the film industry than in games. Anyway, foliage is far more important than rocks.
I think a great destruction and dynamics system would be the next big thing for games, honestly… and even more so for the film industry: real water simulations, destruction without FPS drops. But nothing is guaranteed right now. I just wish Chaos had been better; FPS is terrible right now. I still believe, and hope, that the final version drops none of the previously announced features, and I know it won't. For now, the one thing I feel is fantastic about UE5 is Lumen; I think Lumen is beautiful, dynamic, and hyper-realistic.
You can safely use UE4.27 and then update to UE5 next year; you just can't update to the Early Access version if you choose to start with UE4.27.
Nanite isn't really that much more expensive (memory-wise) than standard-quality assets. LODs and current polycounts are not OK; they're actually terrible. Nanite is the bigger revolution for a lot of reasons and should also support destruction. Also consider that this is just the start; I think we will see a lot of changes in how we think about geometry. On the other hand, I am much more skeptical about Lumen. Don't get me wrong, ray tracing is the future; it's just the Lumen implementation that doesn't fully convince me. I'm not sure it's the future of RT tech.
I'm not talking about Nanite for Nanite's sake… but about the practice of importing huge assets, like massive useless statues with millions of polygons, into UE. It makes project sizes bigger, and people are now focused on… "wow! Nanite lets me import huge assets with millions of polygons, cool!"… but honestly, you shouldn't upload large assets just because you can.
Your projects should actually be created with the consumer in mind, and consumer-wise, 100 GB games are already a bad idea; making 1 TB games is the worst idea. Just the UE5 default project is 100 GB… and it is not even a game, just a level. Imagine making that into a full game with 50 to 100 levels; it would be impossible for an average client to even install it. So Nanite is great but not really a revolution; Lumen is, and it is fantastic. Static lights have always been a problem in the past; now Lumen is beautiful.
Dynamics, destruction, and simulation: that's the revolution games need, and right now we're not even close.
You are assuming a lot here. It's a new tool and people are messing around with it; that doesn't look like a problem to me. If you don't need assets with millions of polygons, just don't use them. Nanite has better culling, better overdraw, better material management, and can handle way more instances than was previously possible, so it's not just about the number of polygons. By the way, Lumen is only possible because of Nanite; take that away and Lumen goes with it. Did you watch any of the videos Epic made on the new features? Because it doesn't look like it.
On the matter of size: yes, you have more polygons, but you don't need other stuff, like 4K normal maps, that also consumes memory. Nanite's compression system already does a good job and is bound to improve; at this point, having a million polygons is not that expensive memory-wise. Also consider that the project size is bigger than the release for the customer; one reason is that you need not only the Nanite version but also the original raw data, which is even bigger. For Valley of the Ancient, Epic also said that most of the space is occupied by textures, and it is a demo, unoptimized in many ways; just check the video for the explanation. The final released package is 20 GB, not 100. Your 1 TB size for a normal game is completely made up.
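For a rough sense of that trade-off, here's a back-of-envelope sketch. The per-texel and per-triangle byte costs are my own assumptions (BC5 block compression is ~1 byte per texel; the ~14 bytes per compressed Nanite triangle is a ballpark from the Early Access era, not an official spec), so treat the exact numbers loosely:

```python
# Back-of-envelope: one 4K normal map vs. a "million-polygon" Nanite mesh.
# Assumed costs, not official Epic figures.

texels = 4096 * 4096               # texels in one 4K texture
bc5_bytes_per_texel = 1            # BC5 compression ~= 1 byte per texel
normal_map_mb = texels * bc5_bytes_per_texel / 2**20

triangles = 1_000_000              # a "million-polygon" asset
bytes_per_triangle = 14            # assumed average for compressed Nanite data
nanite_mb = triangles * bytes_per_triangle / 2**20

print(f"4K normal map (BC5): {normal_map_mb:.0f} MB")
print(f"1M-triangle Nanite mesh: {nanite_mb:.1f} MB")
```

Under those assumptions the two land in the same ballpark (~16 MB vs ~13 MB), which is the point being made: the geometry cost of a Nanite asset can be comparable to a single high-resolution texture it may let you drop.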
When I said I wasn't fully sold on Lumen, I meant on the technical side. The outcome is clearly something we all want and yes, it is a revolution, but the way they achieved it… I don't know, it seems to have a lot of exceptions and limitations and goes in a different direction than the HW manufacturers. I'm just a little more cautious about believing that this solution is generalized enough to scale into the future. I'm waiting to see what happens when they get decent HW acceleration support, and I want to see how they handle high-quality reflections.