It took over 45 minutes to delete 10,500 objects from a level

I went to lunch.

Are improvements in this area planned?

(Also, I had to do this because I no longer trust Datasmith re-imports, after getting many packaging errors from repeatedly re-importing files whose contents had changed a lot while I was working on them.)

EDIT: This was just for deletion from the level. Deleting from the Content folder took another good 15+ minutes. That’s a good half hour where you’re essentially blocked from working in Unreal just because you want to import something anew.

EDIT: The “cleaning up references” step of the above procedure took at least another 15 minutes. We’re up to 45 minutes now.

As an alternative workflow for this issue, I have gotten into the habit of using level streaming and doing one Datasmith import per level. If I want to delete the import, I can remove the streaming level from the persistent level, close Unreal, and then delete the imported folder and the related level directly from Windows Explorer. It only helps when you want to delete the entire import (which sounds like what you are doing in this case). And if you have used anything from the import in other levels, it will mess things up, so you have to be careful about managing the project.
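For reference, the on-disk part of that cleanup can also be scripted instead of done by hand in Explorer. This is only a minimal sketch in plain Python, run while the editor is closed; the folder and level paths are hypothetical examples, not anything specific to your project:

```python
import os
import shutil

# Hypothetical example paths; adjust to your own project layout.
PROJECT_CONTENT = r"C:\Projects\MyProject\Content"
DATASMITH_FOLDER = os.path.join(PROJECT_CONTENT, "MyDatasmithImport")   # folder created by the Datasmith import
SUBLEVEL = os.path.join(PROJECT_CONTENT, "Levels", "DS_Import.umap")    # the streaming level that held it

# Run only while the Unreal editor is closed (same idea as deleting via Windows Explorer).
if os.path.isdir(DATASMITH_FOLDER):
    shutil.rmtree(DATASMITH_FOLDER)

# Remove the sublevel and, if present, its built lighting data sidecar.
for path in (SUBLEVEL, SUBLEVEL.replace(".umap", "_BuiltData.uasset")):
    if os.path.isfile(path):
        os.remove(path)
```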

Not a perfect solution by any means, but it helps me a lot in the early stages of a project when I’m doing a lot of re-importing.

Edit:
Also, reducing the number of objects will help performance here, and it also reduces draw calls at runtime, so it should help overall performance as well. This may not be possible or practical depending on where the geometry is coming from, but, for example, if I’m importing a railing with lots of verticals, I will join the entire assembly into a single mesh, reducing 100 objects down to 1.
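If that joining happens in Max, the step can be scripted rather than done by hand. This is only a rough sketch using MaxScript’s polyop exposed through Python (pymxs); treat the exact calls as assumptions for your Max version and test on a copy of the scene, since attach is destructive:

```python
# Minimal sketch, assuming 3ds Max with Python (pymxs) available.
# polyop.attach is destructive: each source node is collapsed into the target.
from pymxs import runtime as rt

def join_selected_nodes():
    """Collapse everything currently selected into a single Editable Poly."""
    nodes = list(rt.selection)
    if len(nodes) < 2:
        return None
    target = nodes[0]
    rt.convertToPoly(target)            # make sure the target is an Editable Poly
    for node in nodes[1:]:
        rt.polyop.attach(target, node)  # source node is removed after attaching
    return target

join_selected_nodes()
```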

The level streaming was a nice tip! Thanks, I’ll look into that.

Regarding object count: yes, it will all be reduced in the end.

But I’m talking about getting there, and right now, Datasmith doesn’t really support getting there (since it chokes).

It wants a clean, nice, finished, optimized model, but that’s not where you begin, and you want to see where you need to put in effort, hence an early import.

Also, in this case, these are models already converted and textured by our media guy, so it’s not even completely raw from CAD, and there’s still a bit of a gulf between Max and Unreal. I hope it shrinks in the future (and we end up where you can skip Max altogether, but I think we’re pretty far from there, and that’s also off topic now). :slight_smile:

Totally understand the need for rough early imports. Even for those, it can still be worth setting up a system to join things. For example, I work with Rhino and have a script that takes everything on a layer and meshes/joins it into a single object. It might take a few minutes to run over a whole file while I go get a coffee. It’s no good for things with texture mapping or light baking; the final product certainly needs more careful consideration of what to join and how. But for the rough early imports I maybe don’t care so much, and it speeds things up so much with file count in Unreal that it’s often worth the time.
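For anyone curious, a minimal sketch of that kind of layer-joining script (rhinoscriptsyntax plus RhinoCommon, run inside Rhino’s Python editor) could look roughly like this. The meshing parameters, layer handling, and the example layer name are assumptions, not the actual script mentioned above:

```python
# Minimal sketch, run inside Rhino's Python editor; not the poster's actual script.
import rhinoscriptsyntax as rs
import scriptcontext as sc
import Rhino

def join_layer_into_one_mesh(layer_name):
    """Mesh every object on a layer and join the results into a single mesh."""
    ids = rs.ObjectsByLayer(layer_name)
    if not ids:
        return None
    params = Rhino.Geometry.MeshingParameters.Default   # assumed: default meshing settings
    joined = Rhino.Geometry.Mesh()
    for object_id in ids:
        geometry = sc.doc.Objects.Find(object_id).Geometry
        if isinstance(geometry, Rhino.Geometry.Extrusion):
            geometry = geometry.ToBrep()
        if isinstance(geometry, Rhino.Geometry.Brep):
            for part in Rhino.Geometry.Mesh.CreateFromBrep(geometry, params) or []:
                joined.Append(part)
        elif isinstance(geometry, Rhino.Geometry.Mesh):
            joined.Append(geometry)
    if joined.Faces.Count == 0:
        return None
    new_id = sc.doc.Objects.AddMesh(joined)
    rs.ObjectLayer(new_id, layer_name)   # keep the joined mesh on the same layer
    sc.doc.Views.Redraw()
    return new_id

join_layer_into_one_mesh("Railing")      # hypothetical layer name
```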

This is, verbatim, what I was going to suggest. Our DS files are something like 20,000 objects, and I’ve missed deadlines early on when learning Unreal due to this. However, I’ve now learned to break that one DS file into perhaps 5 (architecture, casework, furniture, lights, structural) and create 5 corresponding levels, each with nothing but one of those DS files in it. This gives me a few benefits. One is that if the architecture changes, I only need to export/import that single file instead of EVERYTHING; two, if I need to wipe it clean and start over, I can do the process quoted above. It only takes about 10 minutes total, which isn’t bad.

The other benefit is that the more you split them up, the more artists can work simultaneously. I can have someone working on architecture materials, someone on lights, and someone on furniture, all at once with Perforce.
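If you settle on a fixed set of sublevels like that, the editor’s Python scripting can re-attach them to the persistent level in one go after a wipe. This is only a rough sketch; it assumes the five level assets already exist at the (hypothetical) paths shown, that the Editor Scripting Utilities / Python plugins are enabled, and that these calls match your engine version:

```python
# Rough sketch using Unreal's editor Python scripting; paths and names are hypothetical.
import unreal

world = unreal.EditorLevelLibrary.get_editor_world()
for name in ("Architecture", "Casework", "Furniture", "Lights", "Structural"):
    level_path = "/Game/Levels/DS_" + name          # assumed location of the pre-made sublevels
    unreal.EditorLevelUtils.add_level_to_world(
        world, level_path, unreal.LevelStreamingAlwaysLoaded)
```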