I have two textures that I am trying to blend between for various objects, walls, etc. I have the regular texture map and the dirty texture map, and I want to blend from the dirty texture map to the regular texture map.
Methods I have researched so far…
-decals-
I don’t think this would work because I would need a million decals to make an object sufficiently dirty, and that would be computationally heavy.
-vertex paint-
From what I can tell this can’t be done at runtime, can it?
-runtime virtual textures-
This method seems limited to simple geometric forms or landscapes, or is it?
Maybe the methods above are sufficient and I’m just not familiar with how they can be used. Is there another approach to solving this problem?
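To be clear about the goal: the simplest version of the blend I’m after is just a scalar parameter lerping between the two texture sets in the material, driven at runtime. A minimal sketch of that idea (the "DirtAmount" parameter name and the single-function setup are placeholders I made up, not from any specific tutorial):

```cpp
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Whole-object blend: assumes the material lerps the dirty and regular texture
// maps with a scalar parameter named "DirtAmount" (placeholder name).
void ApplyDirtBlend(UStaticMeshComponent* Mesh, float DirtAmount)
{
    if (!Mesh)
    {
        return;
    }

    // Dynamic instance so the blend can be changed per object at runtime.
    UMaterialInstanceDynamic* MID = Mesh->CreateAndSetMaterialInstanceDynamic(0);
    if (MID)
    {
        // 0 = fully dirty, 1 = fully clean (or however the lerp is wired in the material).
        MID->SetScalarParameterValue(TEXT("DirtAmount"), DirtAmount);
    }
}
```

That only blends the whole object at once, though; blending per area of the surface is the part I’m stuck on.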
Okay, I have something that works now. The problem is that when I cross the UV seams of an object, the raycast leaves a hard edge until it crosses over to the other side of the seam.
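For context, the part that works is roughly a line trace that gets the hit UV and then stamps a brush into a render target that masks the dirt blend. A sketch of that shape (it assumes "Support UV From Hit Results" is enabled in the project’s physics settings, and BrushMID / DirtMaskRT / the "BrushUV" parameter are placeholder names for my brush material instance and mask render target):

```cpp
#include "Engine/World.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Kismet/GameplayStatics.h"
#include "Kismet/KismetRenderingLibrary.h"
#include "Materials/MaterialInstanceDynamic.h"

// Trace against the world, convert the hit to texture-space UVs, and splat a
// brush material into the render target that drives the dirty/clean blend.
void PaintDirtAtHit(UWorld* World, const FVector& Start, const FVector& End,
                    UTextureRenderTarget2D* DirtMaskRT, UMaterialInstanceDynamic* BrushMID)
{
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.bReturnFaceIndex = true; // needed for FindCollisionUV

    if (!World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
    {
        return;
    }

    FVector2D UV;
    if (UGameplayStatics::FindCollisionUV(Hit, /*UVChannel=*/0, UV))
    {
        // The brush material places a soft circle at BrushUV; drawing it into the
        // render target accumulates the mask that the surface material samples.
        BrushMID->SetVectorParameterValue(TEXT("BrushUV"), FLinearColor(UV.X, UV.Y, 0.f, 0.f));
        UKismetRenderingLibrary::DrawMaterialToRenderTarget(World, DirtMaskRT, BrushMID);
    }
}
```

As far as I can tell, the hard edge shows up because the brush circle only lands on one side of the seam, since the two sides live in different parts of UV space.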
Is there a way to reference surface materials and render targets from pixel data? What is a smart way to deal with UV seams?
I am also wondering about the downsides of using multiple traces when crossing UV seams. Do I have to be concerned about how computationally heavy this could get?
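To illustrate what I mean by multiple traces: the main trace plus a few slightly offset traces around it, each painting the same brush, so a stamp near a seam lands on both UV islands. A sketch building on the PaintDirtAtHit helper from the snippet above (the offset count and radius are arbitrary numbers I made up):

```cpp
#include "Engine/World.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Materials/MaterialInstanceDynamic.h"

// Single-trace paint helper from the earlier sketch.
void PaintDirtAtHit(UWorld* World, const FVector& Start, const FVector& End,
                    UTextureRenderTarget2D* DirtMaskRT, UMaterialInstanceDynamic* BrushMID);

// Fire the main trace plus a small ring of offset traces perpendicular to the
// trace direction, so a brush that straddles a UV seam gets stamped on both islands.
void PaintDirtWithSeamCoverage(UWorld* World, const FVector& Start, const FVector& End,
                               UTextureRenderTarget2D* DirtMaskRT, UMaterialInstanceDynamic* BrushMID)
{
    const FVector Dir = (End - Start).GetSafeNormal();
    // Two axes perpendicular to the trace (assumes the trace is not straight up/down).
    const FVector Right = FVector::CrossProduct(Dir, FVector::UpVector).GetSafeNormal();
    const FVector Up = FVector::CrossProduct(Right, Dir);

    const float OffsetRadius = 2.f; // world units, roughly the brush radius (placeholder)
    const FVector Offsets[] =
    {
        FVector::ZeroVector,
        Right * OffsetRadius, -Right * OffsetRadius,
        Up * OffsetRadius,    -Up * OffsetRadius,
    };

    for (const FVector& Offset : Offsets)
    {
        PaintDirtAtHit(World, Start + Offset, End + Offset, DirtMaskRT, BrushMID);
    }
}
```

My assumption is that the line traces themselves are cheap next to the extra DrawMaterialToRenderTarget calls, but I would like to confirm that.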