A question on high poly counts

So I’ve been doing some work with very high poly assets, on the order of 6 million polys per object, in a variety of scenarios. Partly because it’s a semi-requirement for something I’m working on, and partly because I’ve read that DX12 will support massive polygon counts compared to DX11.

My results after working with this geometry have been mixed.

  1. I’m able to import and process large data sets in my plugin fairly quickly, including normal smoothing and vertex de-duplication (there’s a sketch of the de-duplication step after this list).
  2. UE then struggles to handle those results, particularly if Generate Adjacency Buffer is turned on. That option kills the import more than any other.
  3. For a 500k poly object, UV unwrapping fails a lot. UE loses a lot of geometry when it tries to create a lightmap UV, although it does return a simplified result. I tried unwrapping in C4D as a possible workflow alternative; it ran all night, then bailed out of the operation and closed the file without saving.
  4. Even after importing a 2 million poly asset successfully, the framerate in UE tanks: 5 fps on a GTX Titan (DX11; not up to testing with DX12 yet).
  5. Unity performs reasonably well with the same mesh.
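
For anyone curious about (1), the de-duplication is just welding vertices that quantize to the same position. Here’s a rough Python sketch of the idea (the plugin itself is C++, and the tolerance here is a placeholder value):

```python
# Weld vertices that land in the same quantized cell and remap the
# index buffer. Illustrative only; tolerance is a placeholder value.

def weld_vertices(positions, indices, tolerance=1e-5):
    """positions: list of (x, y, z) tuples; indices: flat triangle list."""
    cell_to_new = {}   # quantized position -> index into welded list
    welded = []        # de-duplicated positions
    remap = []         # old vertex index -> new vertex index
    for p in positions:
        key = tuple(round(c / tolerance) for c in p)
        if key not in cell_to_new:
            cell_to_new[key] = len(welded)
            welded.append(p)
        remap.append(cell_to_new[key])
    return welded, [remap[i] for i in indices]

# Two near-coincident corners collapse into one vertex:
verts = [(0, 0, 0), (1, 0, 0), (1.0000001, 0, 0)]
print(weld_vertices(verts, [0, 1, 2]))  # 2 unique verts, indices [0, 1, 1]
```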

It doesn’t look like UE is ready for high poly geometry. Is anyone able to comment on this, share their experiences, or suggest some possible ways to deal with it?

Where did you read that?
I have been following D3D12, and I see NO reason that models will get rendered faster, outside of improved culling.
The aim of D3D12 (and low-level graphics APIs in general) is to lower the cost of sending commands to the GPU for processing.

Besides, the model lives in VRAM either way.

Are those 500K+ counts for all objects?

HTH

I’m basing that on things like this: DirectX 12 Can Push '6 to 12 Times' the Polygons of DirectX 11 - IGN. Apart from that, it doesn’t seem unreasonable to expect polygon counts to continue to grow as they always have. I’m just finding that the tools are not keeping up.

That’s 500k for one object. The 2 million poly object is broken up into about 8 pieces, although I’ve been telling UE to merge the meshes on import.

Interesting, I would like to see the test that they did.

The studio that is making “Ashes of the Singularity” released a YouTube video showing the performance difference between D3D11 and D3D12.
It was the same scene under both APIs, but D3D12 was smooth whereas D3D11 was stuttering.
I would like to see such a test done with poly counts where the ONLY difference is the API used.

Have you tried using a lower polygon version of the mesh to see where UE starts to suffer the most?
Have you looked at the stat groups (I think that’s what they’re called) and done some GPU profiling? The commands I’d start with are below.
It may be that the issue lies elsewhere in the pipeline.
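
If you haven’t used them, these are the console commands I’d reach for first (typed from memory, so double-check the names against the docs):

```
stat fps            (frame rate and frame time)
stat unit           (game, draw and GPU thread times)
stat scenerendering (draw call and primitive counts)
ProfileGPU          (one-frame GPU timing breakdown)
```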

Why are you not up to testing the D3D12 RHI yet?

HTH

Not an option. I’m building an import tool, not a game, so pretty much none of that is relevant. 500k was an optimistic polygon count; I’ve had users send me single objects with tens of millions of polygons. Obviously in that case the answer is education more than anything, but 500k is not an unreasonable amount to want to unwrap.

No amount of GPU or API progress is going to affect the availability of good UV unwrap tools, unfortunately.

It isn’t relevant until my users can use it, but I’m looking ahead with the expectation that most of them will want to take advantage of DX12, since they’re not game developers and can’t typically produce low-polygon assets. I **will** be asked the question “DX12 can handle 60 million polys, why can’t we import that many?” and the answer will be a mix of “because Unreal Engine is ****ing terrible at handling large geometry” and “it loses polygons when it creates the lightmap UV channel”. Neither situation is ideal, and it doesn’t help when my co-worker can fire up our 500k mesh in Unity and it runs perfectly.

I’d settle for an automatic unwrap tool that just works that I can push our users towards. I’m going to give Blender a go tomorrow as I hear its unwrap is quite good.

There’s no good automatic unwrap tool, and on a mesh that high-poly it will most likely produce really, really terrible results anyway. What type of thing are you doing that’s 500K polygons, anyway?

I don’t know that DX12 gives any increase in the number of polygons; the main improvement is how it handles draw calls, which is currently the biggest issue in game performance.

Also, as far as poly counts go in UE4, there’s stuff like post-processing that’s on by default, which will significantly impact performance.

Keep in mind it’s not unwrapping for hand texturing by a human - well-packed individual polygons would be absolutely fine.

The things we’ve been sent are stuff like entire apartments from AutoCAD with high-detail furniture props in them. Basically archvis by people who don’t know what polygon counts are. The 500k model is a house exterior where 95% of the polygons are in the friggen roof tiles.

Ouch. Could you try unwrapping in other 3D apps, such as Blender?

Could you try running the model through a quick polygon reduction to shave off a percentage of the polygons, and see if it’s unwrappable then?

Automatic UVs most often create too many UV islands, which means you waste a lot of space between them. If the island count is very high, then even at the highest lightmap resolution there will be significant issues due to bleeding, since you can end up with islands smaller than a pixel; at a 1024 lightmap, an island narrower than 1/1024th of the UV space is smaller than a single texel.

That’s going to have to be the plan for some users, I think. It’s a really bad compromise IMO, unless I can check for a Blender install and then drive it from the command line or something. I’ll have to take a look tomorrow.
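
If it pans out, the command-line idea would look something like this. It’s only a sketch of what I have in mind: the paths, angle limit, and margin are placeholders, and the operator names are from the Blender 2.7x Python docs, so verify them against whatever version is installed.

```python
# unwrap.py -- run headless as: blender --background --python unwrap.py
# Sketch only: paths and parameters are placeholders; operator names
# are from the 2.7x docs, so verify against your Blender version.
import bpy

# Clear the default scene (cube, lamp, camera) before importing.
bpy.ops.object.select_all(action='SELECT')
bpy.ops.object.delete()

bpy.ops.import_scene.obj(filepath="/tmp/incoming.obj")

for obj in bpy.context.scene.objects:
    if obj.type != 'MESH':
        continue
    bpy.context.scene.objects.active = obj
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    # Automatic unwrap; island_margin leaves padding between islands.
    bpy.ops.uv.smart_project(angle_limit=66.0, island_margin=0.02)
    bpy.ops.object.mode_set(mode='OBJECT')

bpy.ops.export_scene.obj(filepath="/tmp/unwrapped.obj")
```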

Tried this early on when the problem was straight up too much geometry. I set it up to run the incoming geometry through MeshLab during import and the results were bad. A 16x16 tessellated cube couldn’t maintain planar surfaces. VizUp did much better but they wanted way too much to license their SDK. I couldn’t find any other options from googling alone.
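
For reference, the MeshLab step was just shelling out to meshlabserver with a filter script saved from the GUI. Roughly this, as a Python sketch; the paths and script name are placeholders:

```python
# Sketch of the MeshLab decimation pass that ran during import.
# decimate.mlx is a filter script saved from the MeshLab GUI
# (build the filter chain there, then save the current filter script).
import subprocess

def reduce_mesh(src, dst, script="decimate.mlx"):
    # meshlabserver applies the saved filter script to src, writes dst.
    subprocess.check_call([
        "meshlabserver",
        "-i", src,
        "-o", dst,
        "-s", script,
    ])

reduce_mesh("/tmp/incoming.obj", "/tmp/reduced.obj")
```

That’s the setup that produced the non-planar cube results, so treat it as a starting point at best.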

Bleeding isn’t a concern as long as there’s a couple of pixels between islands; our renderer can pad them out and overdraw. And as we’ve seen, UE doesn’t set the bar particularly high when it comes to wasting space: http://i.imgur.com/c7AO2tB.png If the islands in this example were simply enlarged to have a couple of pixels’ space between them, it’d be perfect. But right now UE can’t even maintain the same geometry it was given, so I’m giving up on using a derivative of the lightmap results for now. It’s a shame, because one option was to iteratively creep the islands bigger and create a new UV set, but with missing geometry that’s not going to work.

We’ve had this discussion before and that’s the image I posted last time. My use case is automatic unwrapping of geometry by inexperienced users and there’s no way around it. It might sound crazy now but give it a year and everyone will be expecting this amount of geometry to be OK under certain circumstances.

I don’t expect anything to change about automatic UV methods, especially as games move more towards dynamic lighting, which won’t require lightmap UVs. For now, if you want good results, you’ll likely need to make a lower-poly version of the model and do proper UVs.

Static lighting won’t be abandoned as a lighting technique for a long time to come yet. Dynamic lighting is still too far off from what a static renderer can do.

We should probably stay on topic, as the last thread veered off course after you posted pretty much the same things, darth. Let’s keep it relevant to handling potentially bad user content in a plugin, rather than assuming I’m making all their content myself.

There’s not going to be a solution for what you want to do; game engines are designed to work with optimized content. UE4 has a pretty slow importer, which is the main problem you’re running into, but it wouldn’t have a problem with assets that are optimized for real-time applications.

All my users are trying to do is a little archvis, based on the marketing they’re being targeted with above. Seems like it’s going to happen even if you feel it’s out of scope. I’m just asking where the appropriate tools are.

Anyway, now my thread is a baseless argument with a moderator, of all people. Thanks for the help, everyone else; I think we can close this one.

That’s not what those are about. For the first one, you can bet they optimized their assets for the engine. The second one has absolutely nothing to do with UE4: that’s a DX12 demo for technology used in the Final Fantasy 15 engine, which is not UE4, and on top of that it’s running on 4 graphics cards. DX12 in UE4 is an experimental feature and probably won’t do what you’re expecting. It’s fine that you want to do something the engine isn’t really capable of, and surely they’ll want to improve things, but for now you’ll have to use the tools in the way they’re designed.

If you watched the Enterprise livestream (now on YouTube), you’d see a lot of work went into making their car demo work in real time. Even though the car is really high-poly, they didn’t just import the CAD file and call it done. UE4 Enterprise isn’t presenting itself as a viewport for million-polygon assets, and it’s not designed to accommodate them. I think one of the photogrammetry scenes they showed off was over a million polygons, but it was properly unwrapped.

One option would be to build a separate tool (or function) that imports a very large object into raw RAM and then splits it up into smaller pieces, which are then separately imported into Unreal (sketched below). That could probably be automated reasonably well.

The issue with the importer is all the extra stuff it does, like automatic lightmap generation, material creation, and so on.
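
Something like this is what I have in mind: a sketch that bins triangles into a coarse grid by centroid. The cell size is a placeholder, and real code would also rebuild per-chunk index buffers.

```python
# Bin triangles into a coarse 3D grid by centroid; each non-empty cell
# becomes its own smaller mesh to import separately into Unreal.
from collections import defaultdict

def split_into_chunks(positions, indices, cell_size=500.0):
    """positions: list of (x, y, z); indices: flat triangle index list."""
    chunks = defaultdict(list)  # grid cell -> list of triangles
    for t in range(0, len(indices), 3):
        tri = indices[t:t + 3]
        centroid = [sum(positions[i][axis] for i in tri) / 3.0
                    for axis in range(3)]
        key = tuple(int(c // cell_size) for c in centroid)
        chunks[key].append(tri)
    return chunks  # triangles still use the original vertex indices

# A triangle at the origin and one ~1 km away land in separate chunks:
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0),
         (1000, 0, 0), (1001, 0, 0), (1000, 1, 0)]
print(len(split_into_chunks(verts, [0, 1, 2, 3, 4, 5])))  # 2
```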

The best automatic tools for polygon reduction, UV unwrapping optimisation, and collision building will fail on certain meshes.
Even if they don’t fail, the results are often not as good as if the work were done manually.

Users that put in unoptimized things cannot expect good results.

I can create a single roof tile with millions of polygons if I want to, but I shouldn’t blame tools that are designed for rendering low-poly geometry if they fail to handle that amount of polygons.