Vector3 to Texture in materials


  • On one hand, I have a material function that does texture blending based on distance, to prevent tiling on landscapes. This function takes a TextureObject as a parameter; the texture is blended with itself using different UV mappings depending on distance.
  • On the other hand, I have another function that creates a snow layer and outputs material attributes, among them a base color, a normal, etc. Now I want to pass the output of this function into the tiling function above. For this, I need to convert the BaseColor (Vector3) into a TextureObject, and I haven’t found any way to do that so far…

In the same vein, I also need to feed the output of a Lerp node into a TextureSample, and that doesn’t work either (“Float is not compatible with Texture” error message).

Things do not work that way. A texture is just a texture. The only thing you can do with a texture is read what it contains at a specific place, nothing else.

I worked around the problem by applying the tiling function beforehand, instead of on the output of my snow layer. However, I am wondering whether there is a technical limitation preventing the conversion of a Vector3 or a Lerp output into a Texture; such a functionality would be handy.

The reason you can’t convert a Vector3 to a texture is that they are different things. A texture is just an array of Vector3 values, sure, but you can’t turn an array of Vector3 values into a texture without rasterizing that array. It is possible, with some render-to-texture Blueprints and texture render targets, but that’s super costly and not something you want to run every frame (which is what materials do). So yeah.
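To make the distinction concrete, here is a minimal Python sketch (illustrative only, not any UE API): a texture behaves like a lookup from UV coordinates to a color, while a Vector3 is a single color value. "Converting" the latter into the former means baking that one value into every texel, which is effectively what a render-target capture does.

```python
# A "texture" is a lookup: UV coordinates -> color. Here, a 2x2 row-major
# grid of (r, g, b) tuples stands in for the texel data.
def sample(texture, u, v):
    """Nearest-neighbour sample of a grid of (r, g, b) tuples."""
    h, w = len(texture), len(texture[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

# A Vector3 is just one color value; no UVs involved.
snow_color = (0.9, 0.9, 1.0)

# "Converting" the Vector3 to a texture means writing it into every texel
# (roughly what a render target does, and why doing it per frame is costly):
baked = [[snow_color for _ in range(2)] for _ in range(2)]

print(sample(baked, 0.75, 0.25))  # every sample returns the same Vector3
```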

What I would do is this: where you want your snow to be, instead of having the function take a texture object to blend the snow in, just have it take a Vector3; the ‘white’ output from a texture sampler is a Vector3 anyway.
The only time you’d need a texture object as an input instead is if the function you’re plugging it into has some UV manipulation going on, in which case I’d suggest separating your UV tiling out into its own function.
To do that, you’d have your base texture and snow texture as texture objects. Each would go into a separate instance of the function that does your UV blending based on distance, and both results would then go (as Vector3s) into the main function that blends the base and the snow.
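The restructured flow above can be sketched in Python standing in for the material graph (all function names here are hypothetical, and flat-color lambdas stand in for real texture samplers):

```python
def lerp(a, b, t):
    """Per-channel linear interpolation, like the material Lerp node."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def distance_blend(texture, uv_near, uv_far, blend):
    """UV-tiling stage: sample the same texture at two tilings and blend.
    'texture' is a sampler-style callable (uv -> Vector3)."""
    return lerp(texture(uv_near), texture(uv_far), blend)

def snow_blend(base_color, snow_color, snow_amount):
    """Main blend stage: works purely on Vector3s, no texture objects."""
    return lerp(base_color, snow_color, snow_amount)

# Flat-color samplers standing in for real texture objects.
rock = lambda uv: (0.4, 0.35, 0.3)
snow = lambda uv: (0.9, 0.9, 1.0)

# Each texture goes through its own UV-tiling stage first...
base = distance_blend(rock, (0.1, 0.1), (0.01, 0.01), 0.5)
top  = distance_blend(snow, (0.1, 0.1), (0.01, 0.01), 0.5)
# ...and only Vector3s reach the final blend.
final = snow_blend(base, top, 0.25)
```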

What does your function workflow look like? I can probably help you get what you’re after and help you understand why it does what it does.

Hi Construc_, yes, I came to the conclusion that texture objects/samplers should be used only at the beginning of the graph, where UVs are concerned; everything after that (function inputs/outputs) works with Vector3 or Scalar depending on the data type. This requires performing all the UV-based transformations early in the graph, but that is fine, it’s just a matter of knowing about it :slight_smile:

My flow is as follows:

  1. distance-blend the input textures to prevent tiling
  2. pass the resulting Vector3/Scalar values into a snow cover function
  3. the snow cover function lerps its inputs based on various factors and outputs material attributes
  4. this output can then be used in a LayerBlend or whatever
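Step 3 in particular, lerping whole attribute bundles, can be sketched like this (Python standing in for the graph; the attribute names and values are hypothetical placeholders, not real asset data):

```python
# Vector3 = (r, g, b) tuple; a material-attributes bundle is a dict here.
def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def snow_cover(base, snow, coverage):
    """Step 3: lerp each attribute and output a material-attributes bundle."""
    return {
        "BaseColor": lerp(base["BaseColor"], snow["BaseColor"], coverage),
        "Normal":    lerp(base["Normal"],    snow["Normal"],    coverage),
        "Roughness": base["Roughness"]
                     + (snow["Roughness"] - base["Roughness"]) * coverage,
    }

rock_attrs = {"BaseColor": (0.4, 0.35, 0.3), "Normal": (0.0, 0.0, 1.0), "Roughness": 0.8}
snow_attrs = {"BaseColor": (0.9, 0.9, 1.0), "Normal": (0.0, 0.0, 1.0), "Roughness": 0.3}

# Step 4 would feed this bundle into a LayerBlend or similar.
material = snow_cover(rock_attrs, snow_attrs, 0.5)
```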

One issue I still have: unlike most texture components, which are treated as Scalars, the displacement map needs to be treated as a Vector3. I haven’t found how to tell the system to interpret the Scalar output of a Lerp as a Vector3, so for now I use the displacement map untransformed, straight from a texture component (the R, G, or B output of a TextureSample). This must be possible, since the system can interpret a single-component output as Scalar or Vector3 (directly from a TextureSample node) depending on where it is plugged in.
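If the goal is just to replicate one channel into all three, that is a component splat; in the graph this is typically done by appending the scalar to itself (e.g. an AppendVector chain or a MakeFloat3-style function, depending on your engine version, so check the node names). Conceptually:

```python
def splat(scalar):
    """Replicate a scalar into a Vector3, like appending a float to itself twice."""
    return (scalar, scalar, scalar)

def lerp(a, b, t):
    """Scalar linear interpolation, like the material Lerp node."""
    return a + (b - a) * t

# Scalar output of a Lerp between two displacement samples...
displacement = lerp(0.0, 1.0, 0.5)     # -> 0.5 (a scalar)
# ...made usable where a Vector3 is expected:
displacement_v3 = splat(displacement)  # -> (0.5, 0.5, 0.5)
```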