Looking more into Revit + Datasmith, I’ve noticed that most Arch Viz scenes are smallish interior apartments. I’ve seen a few brief stills of an airport and the Heydar Aliyev Centre, and there’s also a stadium showcase project from HKS… all great.
In our own tests, we can get large 20-storey mixed-use development Revit models into Unreal. These are large datasets with everything in them, from toilets and basins to car parks in the basement and a basic street, etc.
It seems to work fine and the FPS is great, but the issue comes when we look to bake lighting: 32GB workstations instantly run out of memory, even with GPU Lightmass. The other issue is very slow save times, where it can take 30 minutes to save all the assets of a project (wall geometry, etc.).
Is the current strategy for showcasing large projects simply to look at monster computers with 150GB+ of RAM on Google Cloud/AWS machines, where memory is not a problem? In other words, brute-force it with modern hardware?
Or is it best to break up the model by levels, so that one walks around the foyer, the next level is a typical floor, and so on, thus reducing the stress on the computer? And to delete all unnecessary objects like basement details, etc.?
The GPU is a P4000, so only 8GB. It runs out of memory, so I will look to section-cut the model and just break it up by levels, as that’s the most efficient approach.
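The cleanup pass itself could probably be scripted rather than done by hand. Here is a rough sketch, assuming the Python Editor Script Plugin is enabled; the label patterns are just placeholders for whatever Revit categories we would actually drop from the export:

```python
import unreal

# Placeholder label patterns for objects that never appear in the walkthrough.
# Adjust these to match the actual Revit family/type names in the Datasmith export.
UNWANTED_PATTERNS = ["basement", "toilet", "basin", "parking"]

def strip_unwanted_actors():
    removed = 0
    for actor in unreal.EditorLevelLibrary.get_all_level_actors():
        label = actor.get_actor_label().lower()
        if any(pattern in label for pattern in UNWANTED_PATTERNS):
            unreal.EditorLevelLibrary.destroy_actor(actor)
            removed += 1
    unreal.log("Removed {} actors".format(removed))

strip_unwanted_actors()
```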
Realtime just can’t handle massive amounts of data yet, it seems.
We don’t do large architectural CAD models, but we do large product-design CAD models like boats, and I have to spend a week or so just optimizing parts (removing whole subsystems like electrical wiring, deleting holes and screws, merging objects, etc.).
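Some of that triage can at least be semi-automated inside the editor. A minimal sketch, assuming the Python Editor Script Plugin is enabled and using placeholder label patterns, that selects every static mesh actor matching the usual suspects so the whole selection can then be deleted, or collapsed with the built-in Merge Actors tool, in one go:

```python
import unreal

# Hypothetical label patterns for the parts that usually get stripped or merged;
# adjust to whatever the CAD exporter actually names them.
PATTERNS = ["screw", "bolt", "washer", "wiring"]

def select_cleanup_candidates():
    matches = []
    for actor in unreal.EditorLevelLibrary.get_all_level_actors():
        if not isinstance(actor, unreal.StaticMeshActor):
            continue
        label = actor.get_actor_label().lower()
        if any(pattern in label for pattern in PATTERNS):
            matches.append(actor)
    # Highlight the matches in the viewport/outliner so they can be reviewed,
    # deleted, or run through the Merge Actors tool as a single selection.
    unreal.EditorLevelLibrary.set_selected_level_actors(matches)
    unreal.log("Selected {} actors for cleanup".format(len(matches)))

select_cleanup_candidates()
```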
The hardest thing to cope with is that some of these files are 2GB. Maya, Max, all of these programs save files to servers really quickly, but Unreal takes up to 10 minutes to save, which is hard to deal with. It works flawlessly on smaller models, though, just not on large CAD datasets.
The only problem I’ve had with long wait times was deleting Datasmith files from a scene in 4.21 (which could take up to 40 minutes), but I think they fixed that in 4.22…
But you said “to servers”; does that mean you also save your Datasmith exports to a network volume? In my bug report replies, Epic staff have told me that all files, including files you import, should be local on your machine, which I must admit baffled me (because what real-world company operates like that?), but it’s something to try.
Ah. Part of the reason for this, I suspect, is that in a normal Unreal workflow with version control, the project files go nowhere near the file server in the traditional sense. All work is done on local copies on the local machine, and changes are committed back to the repository (or equivalent) without going near a file share. It’s a fairly standard programming workflow, and similar to how you’d work with Revit central files: it makes a local copy and sends the changes back.
Most of this is to enable collaborative working (an Unreal project on a file share can only be opened by one person at a time), plus the view that the .udatasmith files exported from the tools that can produce them are transitory data at best, and aren’t normally something you would save or back up on a network share either. That, and it can really save your bacon if something goes horribly wrong: only your local copy is affected, not the one everyone is working on.
OK, so putting the project onto a local disk (the backup drive on this workstation is an HDD, not an SSD) gave a tremendous increase in save performance.
Which is odd, as the server is a 10Gb RAID 10 and fairly modern.
Didn’t expect it, but I will continue to work locally then. Thanks for the tips.
My main worry was Swarm not being able to function properly off a local disk, but other workstations are picking up from it, so all good!