In the Unreal documentation, it says you can do world-space texturing with custom UVs.
Does anyone know how to set this up with something like WorldAlignedTexture?
I'm trying to get this to work on mobile.
On mobile, any texture sample that manipulates the texture coordinates in any way takes a slow path. These are called dependent texture fetches. By using the customized UV inputs, you can still implement tiling or world space texture mapping while keeping all the texture fetches independent, which is the fast path.
Additionally, everything in the pixel shader on mobile is evaluated with half-precision floats. This causes blocky-looking textures and other artifacts when pixel-shader math is done on texture coordinates. The Customized UV inputs, however, are evaluated at full precision, so they get around this problem.
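To see why half precision hurts tiled or world-space UVs, here is a small sketch using NumPy's `float16` as a stand-in for the mobile pixel shader's half floats (an assumption for illustration; the GPU's actual rounding may differ in detail):

```python
import numpy as np

def quantize_half(x):
    # Round to the nearest half-precision (float16) value, roughly what a
    # mobile pixel shader does to every intermediate result.
    return float(np.float16(x))

# Near the 0..1 range, half-precision steps are small enough for texturing:
print(quantize_half(0.503))   # close to 0.503

# A world-space or heavily tiled UV around 100 has steps of 0.0625, so many
# distinct texel positions collapse to the same value -> blocky sampling:
print(quantize_half(100.03))  # -> 100.0
```

Because the Customized UV inputs are interpolated at full precision, the large world-space values never pass through this coarse quantization.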
Here is a cave material set up to tile at two different rates while still using only independent texture samples.
Alternatively, you can enable the Force Full Precision property on your material, which forces all pixel-shader operations to run at full precision. That should make it work, but it will render slower than using custom UVs.
The problem with WorldAlignedTexture is that the parts of the material that need to be moved to custom UVs live inside the Material Function itself. You could duplicate the function and make a mobile-specific version. The changes you need to make are as follows:
The Divide node gives a 3-component vector holding the X, Y and Z texture coordinates. Instead of splitting it with the 3 masks that generate the YZ, XY and XZ texture-mapping coordinates respectively, you should split it into an XY vector (RG mask) and a Z,0 vector (B mask with an AppendVector of 0). Export those out of the material function with the names CustomUV0 and CustomUV1. Any material using this function will then have to connect those outputs to the Customized UV 0 and 1 inputs.
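In other words (a hypothetical Python sketch of the per-vertex math; `world_pos` and `tile_size` stand in for the WorldPosition input and the Divide node's tiling parameter):

```python
def make_custom_uvs(world_pos, tile_size):
    """Pack world-aligned coordinates into the two 2-component custom UVs."""
    # The Divide node: world position over the tiling size.
    x, y, z = (c / tile_size for c in world_pos)
    custom_uv0 = (x, y)    # RG mask              -> CustomUV0 output
    custom_uv1 = (z, 0.0)  # B mask + AppendVector 0 -> CustomUV1 output
    return custom_uv0, custom_uv1

# e.g. a vertex at world position (512, 256, 128) with 256-unit tiling:
print(make_custom_uvs((512.0, 256.0, 128.0), 256.0))
# -> ((2.0, 1.0), (0.5, 0.0))
```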
Finally, you need to modify the inputs to the 3 textures. Instead of using the YZ, XY or XZ coordinates directly, replace them with a pair of TextureCoordinate nodes looking up texture coordinates 0 and 1, which will receive the customized UV data we generated earlier. Coordinate 0 will give X,Y as a 2-vector and Coordinate 1 will give Z,0 as another 2-vector.
Then use ComponentMask and/or AppendVector to reassemble the YZ, XY and XZ vectors, which you can connect to the 3 texture samplers.
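Sketched the same way (hypothetical names; the two tuples are what the TextureCoordinate 0 and 1 nodes deliver in the pixel shader):

```python
def reassemble_plane_uvs(custom_uv0, custom_uv1):
    """Rebuild the three planar projections from the two interpolated UVs."""
    x, y = custom_uv0   # TextureCoordinate 0
    z, _ = custom_uv1   # TextureCoordinate 1 (second component is the padded 0)
    yz = (y, z)  # feeds the X-facing projection's sampler
    xy = (x, y)  # feeds the Z-facing projection's sampler
    xz = (x, z)  # feeds the Y-facing projection's sampler
    return yz, xy, xz

print(reassemble_plane_uvs((2.0, 1.0), (0.5, 0.0)))
# -> ((1.0, 0.5), (2.0, 1.0), (2.0, 0.5))
```

Since the masking and appending here only shuffle components (no arithmetic on the coordinates), each texture sample still counts as an independent fetch.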
Hey Jack, thanks for the help. I just had trouble on the last part. How would you finish off hooking up to the texture samplers? ELI5… sorry… I really put in an effort (10 hours) to understand RGB channels this weekend, but got lost…
Am I doing this all correctly? Pic 1 is the first part: is it actually a mask of G, or was that a typo? Pic 2 is what I have so far for the last half.