Working With Very Large 3D Datasets in Unreal

I’ll preface all of this by saying that I am very new to UE and am still assessing feasibility for an upcoming project.

I am looking for some advice on working with Digital Twins in Unreal Engine. We are currently working on an arch-vis project, and the city we are working in publishes a really good Digital Twin that is free to download online in multiple formats.

The dataset lets you download in a few different formats and groupings - I have chosen individualised FBXs as this is what best suits the rest of the project. I have been able to import all of this into UE by breaking it up and being very patient.

The big question is where to go from here. I need to get these models responsive enough for close-to-real-time visualisation, but the whole model, once textured, is far too heavy for people to work with effectively. What am I missing here? I was under the impression that Nanite would do a great deal to make this environment more workable, but it doesn't appear to be doing anything.

There are several things to consider, so you will have to do some profiling to identify the bottlenecks and tackle them one at a time. The `stat unit`, `stat gpu`, and `stat scenerendering` console commands will tell you whether you are bound by draw calls, triangle count, or shading cost.

  • Number of triangles your scene has to display: this is where Nanite can help. Note that Nanite is not enabled automatically on FBX imports - you either tick "Build Nanite" in the FBX import options or enable it on the meshes afterwards (see the batch sketch after this list). That would explain why it doesn't appear to be doing anything yet.
  • Number of static meshes: if you have tens of thousands of individual meshes, draw calls become a bottleneck, so it is worth merging static meshes (see the merge sketch after this list).
  • If you are visualising a very wide open space, you will need to look at World Partition and Level Instancing so that only the area around the camera stays loaded.
  • There is some work to be done on materials and textures: are the materials too complex? Make sure there are no duplicates (the last sketch below flags likely duplicate textures), etc…
  • Some other light / material / rendering checks, e.g. how many lights cast dynamic shadows and whether the texture streaming pool is large enough.
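
On the Nanite point: since re-importing thousands of FBXs just to tick "Build Nanite" is painful, here is a minimal sketch for batch-enabling it on the already-imported meshes, assuming the Python Editor Script Plugin is enabled. `/Game/DigitalTwin` is a placeholder for wherever your assets landed, and the `nanite_settings` property names are from recent UE5 releases, so verify them against your engine version.

```python
# Batch-enable Nanite on every StaticMesh under a folder. Placeholder path;
# property names may vary slightly between engine versions.
import unreal

ASSET_ROOT = "/Game/DigitalTwin"  # adjust to your project

for path in unreal.EditorAssetLibrary.list_assets(ASSET_ROOT, recursive=True):
    asset = unreal.EditorAssetLibrary.load_asset(path)
    if not isinstance(asset, unreal.StaticMesh):
        continue
    # Read the mesh's Nanite settings, flip the flag, and write them back
    # so the engine rebuilds the Nanite representation.
    nanite = asset.get_editor_property("nanite_settings")
    if not nanite.get_editor_property("enabled"):
        nanite.set_editor_property("enabled", True)
        asset.set_editor_property("nanite_settings", nanite)
        unreal.EditorAssetLibrary.save_asset(path, only_if_is_dirty=True)
        unreal.log(f"Nanite enabled on {path}")
```

Once that has run, the viewport's Nanite Visualization view modes will show you whether the meshes are actually rendering through Nanite.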
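For the merging point, the editor's Tools -> Merge Actors dialog works fine for a handful of selections, but for a whole city you may want to script it. This sketch merges whatever StaticMeshActors are currently selected; it assumes UE5's `StaticMeshEditorSubsystem` (older versions expose the equivalent calls on `unreal.EditorLevelLibrary`), and the output package path is hypothetical.

```python
# Merge the currently selected StaticMeshActors into a single mesh.
# API names are from UE5's StaticMeshEditorSubsystem; check your version.
import unreal

actor_subsystem = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
mesh_subsystem = unreal.get_editor_subsystem(unreal.StaticMeshEditorSubsystem)

# Only StaticMeshActors can be merged, so filter the current selection.
actors = [
    a for a in actor_subsystem.get_selected_level_actors()
    if isinstance(a, unreal.StaticMeshActor)
]

options = unreal.MergeStaticMeshActorsOptions()
options.base_package_name = "/Game/DigitalTwin/Merged/SM_CityBlock"  # hypothetical output path
options.destroy_source_actors = True  # replace the originals with the merged actor
options.spawn_merged_actor = True

if actors:
    merged = mesh_subsystem.merge_static_mesh_actors(actors, options)
    unreal.log(f"Merged {len(actors)} actors into {merged.get_actor_label()}")
```

Merge per city block or street rather than the whole city at once, otherwise you lose the benefit of frustum and occlusion culling.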
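And for spotting duplicate textures: after a bulk FBX import, duplicates usually show up as same-named Texture2D assets scattered across per-building folders. This quick-and-dirty sketch just groups textures by name and warns on collisions; it loads every asset under the (placeholder) root, so expect it to take a while on a big project.

```python
# Flag likely duplicate textures by grouping Texture2D assets by name.
import unreal
from collections import defaultdict

ASSET_ROOT = "/Game/DigitalTwin"  # adjust to your project

textures_by_name = defaultdict(list)
for path in unreal.EditorAssetLibrary.list_assets(ASSET_ROOT, recursive=True):
    asset = unreal.EditorAssetLibrary.load_asset(path)
    if isinstance(asset, unreal.Texture2D):
        textures_by_name[asset.get_name()].append(path)

for name, paths in textures_by_name.items():
    if len(paths) > 1:
        unreal.log_warning(f"{name} appears {len(paths)} times: {paths}")
```

Collapsing those down to one shared texture and material set (and using Material Instances instead of a unique material per building) cuts both memory use and shader compile time.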