I know most people paint their ground first and then use PCG to generate trees from the ground. I kind of want to do it the other way around. I've been at this for a couple of days now, and I'm stuck.
I have an actor with a spline component where I define a closed-loop area. It also has a PCG component whose graph takes a data asset with weights and tree meshes and generates trees inside the spline area. That part works fine.
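If it helps, this is roughly the setup, sketched in C++ (class and asset names are just placeholders of mine; UPCGComponent comes from the PCG plugin, so the module needs "PCG" in its dependency list):

```cpp
#include "GameFramework/Actor.h"
#include "Components/SplineComponent.h"
#include "PCGComponent.h"
#include "ForestPatchActor.generated.h"

class UTreePaletteAsset; // my data asset: tree meshes, weights, ground texture

UCLASS()
class AForestPatchActor : public AActor
{
	GENERATED_BODY()

public:
	AForestPatchActor()
	{
		// Closed-loop spline that defines the planting area
		SplineArea = CreateDefaultSubobject<USplineComponent>(TEXT("SplineArea"));
		SplineArea->SetClosedLoop(true);
		RootComponent = SplineArea;

		// PCG component whose graph scatters trees inside the spline
		PCG = CreateDefaultSubobject<UPCGComponent>(TEXT("PCG"));
	}

	UPROPERTY(VisibleAnywhere)
	TObjectPtr<USplineComponent> SplineArea;

	UPROPERTY(VisibleAnywhere)
	TObjectPtr<UPCGComponent> PCG;

	UPROPERTY(EditAnywhere)
	TObjectPtr<UTreePaletteAsset> Palette;
};
```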
I would like to change the ground material under the spline to a texture defined in the data asset.
I have looked into two possible ways of doing this:
1. Somehow create a brush from the spline shape and fill it, then use that to paint onto a runtime virtual texture that is blended in via my landscape material.
2. Use a brush for each tree and paint a circle, or some other predefined mask, under each tree.
The first problem with 1 is that I have no idea how to create the filled spline texture.
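To show what I mean by the filled spline texture, below is the kind of thing I've been imagining, purely as a sketch: sample the closed spline into a polygon, then run a point-in-polygon test per pixel on the CPU and bake the result into a mask texture. FillSplineMask is a made-up name, the resolution/bounds handling is simplified, and I don't know if this is even a sensible direction.

```cpp
#include "Components/SplineComponent.h"
#include "Engine/Texture2D.h"

UTexture2D* FillSplineMask(const USplineComponent* Spline, int32 Resolution = 256)
{
	// Sample the closed spline into a polygon in local XY space
	TArray<FVector2D> Poly;
	const int32 NumSamples = 64;
	for (int32 i = 0; i < NumSamples; ++i)
	{
		const float Dist = Spline->GetSplineLength() * i / NumSamples;
		const FVector P = Spline->GetLocationAtDistanceAlongSpline(Dist, ESplineCoordinateSpace::Local);
		Poly.Emplace(P.X, P.Y);
	}

	// Bounding box so pixel (x, y) can be mapped back to a local-space position
	const FBox2D Bounds(Poly);

	TArray<FColor> Pixels;
	Pixels.SetNumUninitialized(Resolution * Resolution);
	for (int32 y = 0; y < Resolution; ++y)
	{
		for (int32 x = 0; x < Resolution; ++x)
		{
			const FVector2D P = Bounds.Min + FVector2D(
				Bounds.GetSize().X * (x + 0.5f) / Resolution,
				Bounds.GetSize().Y * (y + 0.5f) / Resolution);

			// Even-odd rule: count polygon edge crossings of a ray going in +X
			bool bInside = false;
			for (int32 i = 0, j = Poly.Num() - 1; i < Poly.Num(); j = i++)
			{
				const FVector2D& A = Poly[i];
				const FVector2D& B = Poly[j];
				if (((A.Y > P.Y) != (B.Y > P.Y)) &&
					(P.X < (B.X - A.X) * (P.Y - A.Y) / (B.Y - A.Y) + A.X))
				{
					bInside = !bInside;
				}
			}
			Pixels[y * Resolution + x] = bInside ? FColor::White : FColor::Black;
		}
	}

	// Bake into a transient texture (could also be copied into a render target)
	UTexture2D* Mask = UTexture2D::CreateTransient(Resolution, Resolution, PF_B8G8R8A8);
	void* Data = Mask->GetPlatformData()->Mips[0].BulkData.Lock(LOCK_READ_WRITE);
	FMemory::Memcpy(Data, Pixels.GetData(), Pixels.Num() * sizeof(FColor));
	Mask->GetPlatformData()->Mips[0].BulkData.Unlock();
	Mask->UpdateResource();
	return Mask;
}
```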
The second problem, which I get with both 1 and 2, is that the few (outdated) examples I have found that add information to a runtime virtual texture all use render targets, which becomes inconvenient when multiple actors need to access the same one.
Is there a better way of doing this? I feel like I could implement some form of generation queue system to share the render target (roughly what I sketch below), but I'd rather not if there are other alternatives.
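This is about what I mean by a queue system, with all type and member names made up: a tickable world subsystem owns the single shared render target and works through paint requests one per frame, so individual actors never touch the render target directly.

```cpp
#include "Subsystems/WorldSubsystem.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Engine/Texture2D.h"
#include "GroundMaskPaintSubsystem.generated.h"

struct FGroundPaintRequest
{
	TArray<FVector> Points;                 // world positions to stamp
	TWeakObjectPtr<UTexture2D> MaskTexture; // brush/mask from the data asset
};

UCLASS()
class UGroundMaskPaintSubsystem : public UTickableWorldSubsystem
{
	GENERATED_BODY()

public:
	// Actors enqueue work here instead of writing to the render target themselves
	void Enqueue(const FGroundPaintRequest& Request) { Pending.Add(Request); }

	virtual void Tick(float DeltaTime) override
	{
		Super::Tick(DeltaTime);
		if (Pending.Num() > 0)
		{
			// Paint the oldest request into SharedRT (drawing code omitted)
			Pending.RemoveAt(0);
		}
	}

	virtual TStatId GetStatId() const override
	{
		RETURN_QUICK_DECLARE_CYCLE_STAT(UGroundMaskPaintSubsystem, STATGROUP_Tickables);
	}

private:
	UPROPERTY()
	TObjectPtr<UTextureRenderTarget2D> SharedRT;

	TArray<FGroundPaintRequest> Pending;
};
```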
TLDR:
I have an array of coordinates that I would like to iterate over, painting a mask at each point onto a runtime virtual texture. Or: make a texture out of a spline loop (fill it) and paint that texture onto a runtime virtual texture at my actor's coordinates.
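To make the first half of that concrete, here is a sketch of the kind of call I'm picturing, assuming the render target maps 1:1 onto a known world-space area covered by the RVT volume. PaintMasksAtPoints, WorldBounds, and the brush parameters are all placeholders, and the render target would still have to be hooked into the landscape material somehow, which is the part I'm unsure about.

```cpp
#include "Kismet/KismetRenderingLibrary.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Engine/Texture2D.h"
#include "Engine/Canvas.h"

void PaintMasksAtPoints(UObject* WorldContext,
                        UTextureRenderTarget2D* MaskRT,
                        UTexture2D* BrushTexture,
                        const TArray<FVector>& WorldPoints,
                        const FBox2D& WorldBounds,
                        float BrushWorldRadius)
{
	UCanvas* Canvas = nullptr;
	FVector2D RTSize;
	FDrawToRenderTargetContext Context;
	UKismetRenderingLibrary::BeginDrawCanvasToRenderTarget(WorldContext, MaskRT, Canvas, RTSize, Context);

	// How many render-target pixels one world unit covers
	const FVector2D PixelsPerUnit = RTSize / WorldBounds.GetSize();
	const FVector2D BrushSize = PixelsPerUnit * BrushWorldRadius * 2.0f;

	for (const FVector& P : WorldPoints)
	{
		// World XY -> render-target pixel position, centred on the brush
		const FVector2D PixelPos = (FVector2D(P.X, P.Y) - WorldBounds.Min) * PixelsPerUnit;
		Canvas->K2_DrawTexture(BrushTexture,
		                       PixelPos - BrushSize * 0.5f,
		                       BrushSize,
		                       FVector2D::ZeroVector, // source UV start
		                       FVector2D::UnitVector, // source UV size
		                       FLinearColor::White,
		                       BLEND_Additive);
	}

	UKismetRenderingLibrary::EndDrawCanvasToRenderTarget(WorldContext, Context);
}
```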