UV Texture Calculation

I’ve always wondered how UE calculates the UV input on the TextureSample and TextureSampleParameter2D nodes. For example, say I have this simple PNG image below:

92724-colormap.png

If I input a 1-vector/scalar value, it produces a yellow color. If I input a 2-vector value, it produces yellow as well. If I input a TextureCoordinate node, it produces the same image, but it can be tiled based on the U or V value.

The reason I’m asking is that, in the Starter Content, there’s a material called cobble. If I apply it to a mesh, it produces an amazing 3D effect out of a 2D image. I studied the material node calculations, but they’re very confusing. If anyone knows how to explain this complicated material, please help me.

What I’m more concerned about is the UV input on those texture nodes. How does UE actually use this input?

The following information has nothing to do with Unreal Engine specifically, but is more or less a basic explanation of triangle rendering.

A model consists of vertices. These are usually made by a 3D artist in a modelling program like Maya, 3ds Max or Blender. Every vertex can carry information such as its position, normal, tangent and UVs, all of which can be set inside the modelling program.

In the Unreal Material editor you create a so-called shader. The first program in this shader is a vertex program: basically a function that runs for every vertex currently being rendered. Unreal does most of the work for you, but you are allowed to give the vertex a world position offset. In this program, all the data needed to render the pixels on the screen is handed over to a so-called fragment program, so the UV stored in the vertex is passed along to it. The fragment program runs for each fragment on the screen. A fragment is basically a contribution to a pixel on the screen; a good example is sand at the bottom of a river, where the sand fragment is blended with the water fragment in front of it to create the final pixel on the screen.
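To make that flow a bit more concrete, here is a minimal HLSL-style sketch (not UE’s actual generated shader code; the texture, sampler and struct names are just placeholders) of a vertex program handing the artist-authored UV over to the fragment (pixel) program, which then uses it for the texture lookup:

Texture2D ColorMap : register(t0);         // placeholder for the sampled texture
SamplerState LinearSampler : register(s0);

struct VSInput
{
    float3 Position : POSITION;
    float2 UV       : TEXCOORD0;   // authored by the artist in the modelling tool
};

struct VSOutput
{
    float4 Position : SV_Position;
    float2 UV       : TEXCOORD0;   // interpolated per fragment by the rasterizer
};

VSOutput MainVS(VSInput In)
{
    VSOutput Out;
    Out.Position = float4(In.Position, 1.0); // a real shader applies the full transform here
    Out.UV = In.UV;                          // just pass the UV along
    return Out;
}

float4 MainPS(VSOutput In) : SV_Target
{
    // The UV arriving here has already been interpolated across the triangle.
    return ColorMap.Sample(LinearSampler, In.UV);
}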

So, back to the TextureSample node. The only thing it does is a lookup into the provided texture. This is usually done using the UVs stored per vertex, and a UV is just a value for X and a value for Y, each usually between 0 and 1.
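As a rough sketch of what that lookup means (the names ColorMap and TexSize are placeholders, and a real sampler adds filtering and wrap modes on top of this), a UV in the 0–1 range is simply scaled up to the texture’s resolution to pick a texel:

Texture2D ColorMap;  // the texture plugged into the node
float2 TexSize;      // assumed to hold the texture’s width and height in texels

float4 LookupByUV(float2 UV)
{
    // (0, 0) maps to the first texel, (1, 1) to the last one.
    int2 Texel = int2(UV * (TexSize - 1.0));
    return ColorMap.Load(int3(Texel, 0)); // point lookup, no filtering
}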

Here is a simple example with your texture applied to a quad:

[0,1]-------------------[1,1]
|                           |
|                           |
|                           |
|                           |
|                           |
|                           |
|                           |
|                           |
|                           |
[0,0]-------------------[1,0]

There are 4 vertices processed:
Bottom left (given uv coordinate float2(0, 0) by the artist)
Bottom right (given uv coordinate float2(1, 0) by the artist)
Top left (given uv coordinate float2(0, 1) by the artist)
Top right (given uv coordinate float2(1, 1) by the artist)

So the UVs are given to each fragment. The number of fragments depends entirely on the screen resolution:
this quad could cover an area of 100x100 pixels, which means the fragment program will be executed 10,000 times.
Each fragment gets a value interpolated from the three vertices of the triangle it lies in. So if a fragment is in the center of this quad, it will get a UV value of float2(0.5, 0.5). In Unreal, the node that reads this stored UV is called TexCoord (TextureCoordinate). The default channel it reads from is 0, but the artist has the option to use several UV channels if needed.
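The interpolation itself is nothing more than a weighted average over the three vertices of the triangle the fragment lies in. The GPU does it for you, but conceptually it looks like this sketch:

// Barycentric weights always sum to 1; the rasterizer computes them per fragment.
float2 InterpolateUV(float2 UV0, float2 UV1, float2 UV2, float3 Weights)
{
    return Weights.x * UV0 + Weights.y * UV1 + Weights.z * UV2;
}

For the fragment at the center of the quad, sitting on the triangle made of the bottom-left, bottom-right and top-right vertices, weights of (0.5, 0, 0.5) give exactly float2(0.5, 0.5).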

So by hooking up the TexCoord node to the TextureSample node, you take the UVs created by the artist and stored in the vertices and use them to look up the color inside the texture. The texture sampler just receives the interpolated value between float2(0, 0) and float2(1, 1) and returns the color at that position. So in your example, a value of float2(0.75, 0.5) ends up exactly in the center of the orange square.

Cheers!
Roel

Roel, this is an amazing explanation. But about the texture coordinate system, is it really like that?

 [0,1]-------------------[1,1]
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 [0,0]-------------------[1,0]

Shouldn’t it be like this:

 [0,1]-------------------[1,0]
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 |                           |
 [0,0]-------------------[1,1]

I thought the origin of the vector is always at the top left of the texture image, not the bottom left. I tried modifying the UV tiling value in the TexCoord node to [0.5, 0.5], and the image showed from the top left, covering 50% toward the right side and 50% toward the bottom. CMIIW.

If we are talking about a two-dimensional coordinate system, X is usually horizontal and Y vertical, and all values are written as (X, Y). What you wrote does not make much sense, because along a flat horizontal edge both X and Y end up becoming 1 at (1, 1).

If you would like to visualize the UVs on your object just to see how it works, you can hook up the TexCoord node to the Emissive channel. X-Y-Z will be visualized as R-G-B; in this case there is only X-Y. So at (0, 0) you will see black, at (1, 0) red, at (0, 1) green, and at (1, 1) yellow (red and green make yellow).
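In shader terms, that debug hookup boils down to something like this sketch (the function name is made up):

// U is shown as red and V as green, so the corners come out black, red, green and yellow.
float4 VisualizeUV(float2 UV)
{
    return float4(UV.x, UV.y, 0.0, 1.0);
}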

Roel’s explanation is great.

I’ve done a video tutorial recently, in case animated slides make it easier to understand: UE4: Dynamic Texture Coordinates (Animated, Generated UVs) - YouTube

Fantastic video Oskar!