Hello!
We have a bit of a problem with creating variation for windows in our procedural buildings. I’ll try to explain it as best as I can.
So we are building an open-world game set in a city. For buildings we’ve decided on a procedural approach that uses premade elements. Here is an example, consisting of a few of these elements put together to simulate a building facade (a really crappy one):
Now, while we have no real difficulty combining the meshes, we’ve run into the problem of making the windows visually different. As you can see in the screenshot above, we’ve managed to sort this out through materials: a pseudo-random UV offset based on the object’s world position, using a texture atlas. It works well enough.
But then, in our building creation pipeline, these elements have to be merged into a single static mesh to save draw calls. Here’s what we get:
I may not have made the perfect example, but you can see that, say, the single windows on the left are now identical. It happens because all of them now have only a single world position to work with, since they’re all in the same actor. This case may not look that bad, but there are worse scenarios, believe me.
So now we are kinda stuck and not sure what to do. I’ve tried to come up with several solutions. For example, I wanted to use vertex colors to randomly offset the UVs. That would work great if we could paint vertex colors on the separate elements (either manually or programmatically). But we can’t.
Another idea we have is to place sockets in these elements where the lower-left and upper-right corners of the windows should be, and then generate the windows in code when creating buildings. But that approach is resource-intensive and won’t be ready soon, while we need a solution now (we’re starting on the demo project).
So my question is: can you recommend a way to do this? Maybe some HLSL code to get vertex IDs, or calculate UV shells, or paint vertex colors? Anything?
Thank you in advance,
Alexander