Lightmap resolution: Is it fair to say a single 1024 = 256 x 64?

Hi All,

I’m creating a very large Archviz scene and the light build takes a very long time.

Am I correct in presuming that a single lightmap resolution of 1024 (i.e. 1024×1024 = 1,048,576 texels) is equivalent to having 256 lightmaps at 64 (i.e. 256 × (64×64) = 1,048,576 texels)?

Now imagine a single floor piece.
Does this mean I would get better performance and quicker light build times if I broke the floor into 100 smaller pieces (each with a lightmap resolution of 64) instead of keeping it as a single object with a 1024 lightmap resolution?
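
Here is a quick sanity check of the texel math (a standalone sketch in plain Python, nothing engine-specific):

```python
# A lightmap of resolution R covers R * R texels.
def texels(resolution, count=1):
    """Total lightmap texels for `count` meshes at `resolution`."""
    return count * resolution * resolution

print(texels(1024))     # 1,048,576 -> one floor piece at 1024
print(texels(64, 256))  # 1,048,576 -> exact equivalent: 256 pieces at 64
print(texels(64, 100))  # 409,600   -> the 100-piece split above
```

Note that 100 pieces at 64 come to only 409,600 texels, well under the 1,048,576 of a single 1024 map, so 256 pieces would be the like-for-like comparison; with 100 pieces, part of any speed-up would simply come from baking fewer texels.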

Currently in my scene I have:
283 SM with 256 lightmap
138 SM with 128 lightmap
800ish SM with 64 lightmap
1100ish SM with 32 lightmap
100ish SM with 16 lightmap

So if I were to break down the 283 SMs into (let's say) 600 different pieces, and the 138 SMs into (let's say) 250 different pieces, with all the new pieces at a lightmap resolution of 64, would I get similar lighting results at considerably lower build times?
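
For reference, here is the texel tally for this proposal (a standalone sketch, using the approximate mesh counts listed above):

```python
def texels(resolution, count=1):
    return count * resolution * resolution

# Current layout of the two largest groups.
current = texels(256, 283) + texels(128, 138)   # 20,807,680 texels
# Proposed re-split of just those two groups.
proposed = texels(64, 600) + texels(64, 250)    #  3,481,600 texels

print(f"{current / proposed:.1f}x fewer texels")  # ~6.0x
```

So the proposal bakes roughly a sixth of the texels for those two groups, even before any non-linear effects come into play.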

Many thanks.

Well, I didn’t get any answers, but I did go ahead with my assumption, and I did manage to save a considerable amount of time.
But if anyone still has any answers, feel free to share…

I’ve never tested build times that way, but splitting meshes is fine as long as you don’t get any shadow bleeding artifacts or differences in shading due to vertex normals. Also keep in mind that, performance-wise, fewer objects are better (this can change depending on the poly count of the mesh, the design of the level, etc., but in general fewer is better, especially for simple models like a wall or a floor mesh), so keep an eye on draw calls as well (e.g. with the stat RHI console command).

Basically, I think your assumption is right: for a large lightmap that covers more complex geometry, the cost of lightmap compression and 2D packing rises non-linearly.
You can also check Swarm Agent to see at which stage the time is reduced the most.
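
To illustrate the non-linear point, here is a toy model (the 1.5 exponent is purely an assumption for illustration; the real scaling of Lightmass compression and packing isn't documented here):

```python
# Toy model: assume per-lightmap cost grows like texels ** 1.5
# (assumed exponent, purely illustrative). Under that assumption,
# splitting one big map into many small ones reduces total work
# even when the total texel count is identical.
def cost(resolution, count=1, exponent=1.5):
    return count * (resolution * resolution) ** exponent

big = cost(1024)       # one 1024 lightmap
small = cost(64, 256)  # 256 lightmaps at 64 (same 1,048,576 texels)

print(f"{big / small:.0f}x")  # 16x less modeled work for the split
```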
