Unreal marketplace has gone Nanite INSANE!

I just want to mention ( and whine about ) this here, as I think it’s probably starting to affect a lot of people.

I have bought several products from the marketplace recently where any effort to get poly counts into a sensible range has basically been abandoned, because of Nanite. Nanite is enabled on EVERYTHING regardless of what its function is.

Can someone please tell these vendors ( I have ) that just because we have Nanite, pride in being able to produce a model without an INSANE poly count should not be thrown out the window.

I hear you say “Well, you can just turn Nanite off”. Er - can’t, because these ( small ) models have about 1.5M tris each. Yes, you read that right - 1.5 MILLION tris. So if you want to turn off Nanite, you have to spend an hour or so applying reductions to the models. In the pack I got today ( no names ), all of the models could be reduced to < 50k tris without any loss of quality. The bare model data is taking up about one hundred times the disk space it should.
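For a rough sense of scale, here’s a back-of-the-envelope sketch of what the raw mesh data costs. The byte sizes are assumptions, not measurements from the pack: ~32 bytes of vertex attributes per vertex (position, normal, tangent, one UV set), 4-byte indices, and the rule of thumb that a closed triangle mesh has roughly half as many verts as tris.

```python
def raw_mesh_bytes(tris, bytes_per_vertex=32, index_bytes=4):
    """Rough raw-data estimate for a triangle mesh.

    Assumes ~32 bytes of vertex attributes per vertex and that a
    typical closed mesh has about half as many vertices as triangles.
    """
    verts = tris // 2
    return verts * bytes_per_vertex + tris * 3 * index_bytes

tile_hi = raw_mesh_bytes(1_500_000)   # the 1.5M-tri marketplace tile
tile_lo = raw_mesh_bytes(50_000)      # the < 50k-tri version it could be
print(f"{tile_hi / 1e6:.1f} MB vs {tile_lo / 1e6:.1f} MB "
      f"({tile_hi / tile_lo:.0f}x)")  # -> 42.0 MB vs 1.4 MB (30x)
```

The exact on-disk figure will differ ( Nanite compresses its geometry, and .uasset files carry extra editor payload ), but the point stands either way: storage scales roughly with triangle count, so a 30x triangle surplus is a 30x raw-data surplus.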

Maybe I also hear you say “Well, everybody can just use Nanite”. Once again - BZZ! - nil points. Up until recently I had a GTX 1050 Ti, and Nanite made absolutely no difference whatsoever. In fact, it often made things worse. And most game players have cards worse than that. So we still have to work with LODs and instancing.

To be clear, I think Nanite is a great idea, and is very close to being widely useable, but it’s not there yet. And people need to know that.

I’d love to hear from anyone else who is experiencing this type of thing…

10 Likes

I agree. Creators can’t throw optimization out of the window assuming some other “magic tool” will do their work. If Nanite is becoming an actual requirement for content sold on the marketplace it should be made very clear on the product page. I’d consider 1.5M tris in a basic model a serious flaw and reason for a refund.

7 Likes

To be fair, it did say ‘Nanite’ on the page. But usually you can just switch it off. Here you can’t. CRAZY modelling. To be clear, we’re talking about 1.5M tris on something the size of a football.

The vendor got back to me and basically tried to say it was normal. Hmmm… ( this is a vendor that works closely with Epic, BTW ).

3 Likes

Well, if it said Nanite, it did say Nanite. It’s still not good practice to have a ridiculous number of verts when they’re not required. Those models are difficult to modify and clean up manually, and it does matter for performance.

If it’s a scan specifically made for Nanite, which naturally produces tons of verts, then that’s also a different situation from someone who modelled a football manually in Blender ;p.

I once refunded a model on another marketplace because it was impossible to clean up. They had made a scan of a creature and made no attempt to clean it up before posting it on their marketplace. It didn’t have a million verts like you’d see on some scientific mummy scan, but it was complex enough that it couldn’t be cleaned up manually to prepare it for rigging. Nanite did not exist at the time ;). Given a context like that, you can decide if it’s a product flaw or not.

Luckily you can request a refund within 14 days, and I believe the request is granted even faster if the seller does not reply in time.

3 Likes

It said Nanite, I didn’t say INSANE Nanite :wink:

2 Likes

I think Epic’s point of view is ‘Nanite is all you need, that’s it’. But I don’t know.

Hopefully it won’t become too much of an issue. A LOT of people in the community are building games and know that most of their players’ hardware can’t handle too much, so they won’t buy these products.

The vendor actually wrote back and basically said ‘this is not many polys, it’s the norm now’. What!?

I’ve PM’ed you the product :slight_smile:

5 Likes

:rofl: I’ll advise the community to add more potato batteries to their systems then. If it’s the norm there are going to be some amazing changes in mobile hardware tomorrow morning.

2 Likes

From the offending package

Er, but, but… it’s a flat square tile… I would recommend 20 tris, 250 for a really gnarly chipped tile, maybe 500 for an arch viz game called ‘nose on the tiles’…

And talking of potato batteries, using more disk space = bad for the environment. But that’s not the main issue here.

6 Likes

ffs, between 20 and 100 verts would be more than enough for that thing. Add a normal map, displace the verts a bit, and done. The end result would be more performant and smaller than this thing. You can’t call that the norm for a cube. I was expecting a MetaHuman at least. We have tons of free material libraries these days; you could just brush-paint over a cube and get this result. Definitely get a refund.

1 Like

These are pros, this is a well-known bunch! When I pointed this sort of thing out, I got “This is the standard Epic workflow now”.

:rofl:

1 Like

Oh come on, I’ve been doing this type of work for about 15 years in 3D modelling, texturing and programming. What a joke.

Quality is in the details. Not the details you get when you scan a brick and import 20,000 vertices into your software, but the details you implement because they have actual value.
Artistic value, performance value, details implemented because the end user will actually see and care about them. If something doesn’t have that value, you simply don’t implement it. Saying “here’s a million vertices in a virtual teacup” does not have that value. Maybe it does if you are a museum which absolutely needs a top-quality virtual scan of a teacup, but definitely not if you want to play a game.

Come to think of it, you say this creator is a pro, but implementing more detail than you require is a beginner mistake. Some of the absolute first things you learn when modelling objects and environments are:

  1. Details must match: you can’t have one object at ultra detail and another at lower detail, because a high-detail object makes the other look bad.
  2. You don’t implement details you can’t see. This applies to literally invisible details (like a screw modelled inside a piece of wood) or an area you can’t visit with the character.
  3. You must be aware of how much detail you can see from the camera’s perspective. Implementing any more is just bad for performance.

Now, I could list a lot more beginner knowledge, and you can’t just ignore it by saying “Oh, but Nanite does it for you”. Right now I am not interested in working with Nanite because it is not relevant for my current projects. It is not relevant for the target hardware. It is not relevant for the development workflow.

I’ll die laughing while I drown in my coffee when the next Crash Bandicoot-like game is made with Nanite.

4 Likes

I guess the decision point for an artist is: “is there a greater cost to getting surface information from a normal map, or from geometry?”

It does seem stupid for a concrete brick to have millions of triangles. But what if you want high resolution for the surface details? Would a 2K+ normal map cost more than the Nanite geometry that negates the need for it?
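That question can at least be bounded with rough numbers. A sketch under stated assumptions ( BC5 block compression for the normal map, which stores 16 bytes per 4×4 block, i.e. 1 byte per texel; ~32 bytes of raw attribute data per added vertex; ~1 vertex per 2 triangles, so going from 50k to 1.5M tris adds roughly 725k verts ):

```python
def bc5_bytes(width, height, mips=True):
    """BC5 (two-channel block compression, typical for normal maps)
    stores 16 bytes per 4x4 block, i.e. 1 byte per texel.
    A full mip chain adds roughly one third on top."""
    base = width * height
    return base * 4 // 3 if mips else base

def extra_vertex_bytes(extra_verts, bytes_per_vertex=32):
    """Raw attribute data for the vertices the dense mesh adds."""
    return extra_verts * bytes_per_vertex

normal_map = bc5_bytes(2048, 2048)        # 2K normal map, ~5.6 MB with mips
dense_geo  = extra_vertex_bytes(725_000)  # ~23 MB of extra raw vertex data
print(f"normal map {normal_map / 1e6:.1f} MB "
      f"vs extra geometry {dense_geo / 1e6:.1f} MB")
```

On uncompressed numbers the 2K normal map comes out several times cheaper. Nanite does compress its geometry well, though, so in practice this is something to measure per asset rather than assume either way.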

And there are also factors beyond sheer optimization. If one workflow is easy and fast and the other is arduous, you can count the time savings as well.

I didn’t look at the assets in question, just making some broad points about Nanite from what I understand of it. I still haven’t downloaded UE5 yet, and probably won’t for another year.

1 Like

Modellers - please read this and take note - Nanite is great for feature meshes, but when you release things such as a 400K-tri block pedestal, you’re going to lose all your credibility, like this vendor just has.

7 Likes

Seconded…

2 Likes

You’re right, I’m sure it is an easier workflow. But the fact remains that probably over 90% of game players out there can’t run this stuff on their machines ( for a few years at least ). And that will mean that 90% of the stuff on the marketplace is pretty much useless, if everyone goes down this route.

1 Like

I meant in relation to the entire pipeline - not just what is easiest for the artist but causes problems everywhere else.

My hunch is that it’s just a lazy artist and it’s not good practice, but with some testing it may turn out that in a lot of cases, what seems like an overkill mesh might actually make more sense than a traditional mesh + normal maps.

I dunno if that’s the case here or not; I guess if the author wanted to make a case, they could.

In fact, the author did say there was enough poly detail to not use a normal map ( although there was one ) :slight_smile:

I think shipping both might be an idea… ( low and high ). Not too sure about that though.

With something like a pedestal, unless you’re prone right next to it, it will never be closer than, say, 1.5 meters - more like 1.8 meters (straight down; usually more, with an angle). For things like this you’re not going to want ultra-high-def meshes (well, I don’t in my projects).
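A quick sanity check on how much detail is even resolvable at that distance, using a simple pinhole camera model. The numbers are assumptions for illustration: a 3 cm surface feature (a chip on the pedestal), a 90° horizontal FOV, and a 1920 px wide screen.

```python
import math

def pixels_covered(feature_m, distance_m, fov_deg=90.0, screen_px=1920):
    """On-screen width in pixels of a feature of the given size,
    seen head-on at the given distance (simple pinhole model)."""
    view_width = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return feature_m / view_width * screen_px

print(f"{pixels_covered(0.03, 1.8):.1f} px")  # a 3 cm chip at 1.8 m -> ~16 px
```

At roughly 16 px across, a chipped edge needs a handful of triangles, not thousands; anything denser is sub-pixel geometry the renderer mostly throws away (or, with Nanite, clusters away).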

I’m interested to find out more about the overall performance and memory consumption of Nanite and high-def meshes. It may be true that a high-density mesh without a normal map could be more efficient in some situations, but I’m thinking it would use more VRAM in almost all circumstances - especially if the normal map has been kept small to allow for the fact it’s never looked at close up.

Also, Nanite appears to create imposters for everything - that means an extra set of textures for them (I don’t know how they are managed, but there will be some overhead involved). Having said that, they could be part of why Nanite performs so well - a single draw call for the far-away meshes.

1 Like

Coming from the film CG industry, I can tell you this: it’s a money grab on the UE marketplace at the moment. You can just sculpt something in ZBrush, throw Nanite on, and call it a product. Or buy stuff from TurboSquid or Evermotion, change it a bit, put Nanite on, and resell it to people here who have no clue. The thing is, clean, good topology takes skill, knowledge and extra work. My suggestion is: start making some of your own models, to the point where you can tell a real product from trash.

2 Likes

As a VR dev… Nanite has no current value to me, as it is not yet supported in VR. I do purchase non-VR assets for use in VR, as there is usually no problem in doing so… but Nanite will start to become an issue if I have to keep disabling it just to use an asset.

What needs to happen is that asset creators either supply both Nanite and non-Nanite versions in their folders, or give users the ability to download a non-Nanite version if Nanite is unwanted.