Hello, I'm having a problem. I'm trying to import a scene with about 9 million polygons, exported with Datasmith from Navisworks. I had to split the export into 6 parts because of RAM. I have 16 GB of RAM (with paging it goes up to about 42 GB).
The thing is, when I load each Datasmith file (which inserts all the meshes into the level), memory consumption increases, which makes sense. But when I then save all those unsaved assets, memory consumption increases further while the save progress bar advances, and it won't decrease even after the save finishes. So at some point, while importing and saving each Datasmith file, the PC exceeds the 42 GB of total RAM and Unreal crashes.
The scene has about 30k-100k objects (mesh actors).
So, my question is: is there a more efficient way to do what I'm doing (like merging meshes after each import, or saving the assets in batches), or do I simply need more RAM? In other words, how do I know that adding another 16 or 32 GB of RAM won't leave me with the same problem on a scene of a given complexity? Is the RAM requirement proportional to the number of meshes or polygons in the scene? Or is there a smart way to handle a scene of any complexity with a fixed amount of RAM?
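For the batch-saving idea, this is roughly what I had in mind, in case it helps explain the question: a sketch using the editor's Python scripting (`unreal.EditorAssetLibrary` / `unreal.SystemLibrary`). The content path and batch size are placeholders, and I'm only assuming that forcing garbage collection between batches would release some of the save memory; I haven't verified that it does.

```python
# Rough sketch: save dirty assets in batches from the Unreal Editor's
# Python console, forcing garbage collection between batches in the
# hope of bounding peak memory. "/Game/Navisworks" and BATCH_SIZE are
# placeholders for illustration.

BATCH_SIZE = 500  # tune to available RAM headroom


def chunked(seq, size):
    """Yield successive slices of `seq` of length at most `size`."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]


def save_in_batches(root="/Game/Navisworks"):
    import unreal  # only available inside the Unreal Editor

    # List every asset path under the import folder.
    paths = unreal.EditorAssetLibrary.list_assets(root, recursive=True)
    for batch in chunked(paths, BATCH_SIZE):
        for path in batch:
            # Skip assets that are already saved.
            unreal.EditorAssetLibrary.save_asset(path, only_if_is_dirty=True)
        # Assumption: collecting garbage between batches drops transient
        # save buffers before the next batch starts.
        unreal.SystemLibrary.collect_garbage()
```

Would something along these lines actually keep the peak memory lower, or does the editor hold onto the saved packages regardless?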