A while ago, I noticed the Shader Stage Switch node in the material editor, but I can’t find any documentation on how to use it. It seems like an interesting optimization, but how do you actually use it? For example, what’s the difference between passing part of the material through a vertex interpolator versus the VertexShader input of this node? And how do you combine results with it?
Regards
The main difference is that a vertex interpolator takes a value computed in the vertex shader and smoothly interpolates it across a triangle’s surface for use in the pixel shader. Adding vertex interpolators increases the number of interpolants the fixed-function graphics pipeline needs to handle during rasterization.
The Shader Stage Switch behaves like a simple branch. It allows you to define different logic to execute in the vertex and pixel shader stages. It evaluates the vertex input when used in the vertex shader (when driving a vertex interpolator, custom UV, or World Position Offset), and the pixel input when used in the pixel shader (for example, Base Color, Metallic, Roughness, etc.). It does not introduce any extra interpolants since it does not pass data between shader stages.
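Conceptually, the behavior described above can be modeled as a compile-time select (this is a sketch of the idea, not Unreal’s actual implementation): when each stage’s shader code is generated, only one of the two inputs survives, so the unused path is compiled out entirely.

```python
# Conceptual model only (assumption: the switch is resolved when each stage's
# shader code is generated, like a compile-time branch, so the unused input
# contributes no instructions and no interpolants).
def shader_stage_switch(vertex_input, pixel_input, stage):
    """Select which node-graph expression survives into a given shader stage."""
    return vertex_input if stage == "vertex" else pixel_input

# The same material node feeds both stages, but each stage sees only one path:
shader_stage_switch("raw_uv", "interpolated_uv", "vertex")  # -> "raw_uv"
shader_stage_switch("raw_uv", "interpolated_uv", "pixel")   # -> "interpolated_uv"
```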
A typical use case for Shader Stage Switch is to share a common node chain between vertex and pixel stages but control which logic runs in each. For example, you can place a Shader Stage Switch after a vertex interpolator: you pass the interpolator result to the pixel shader input and bypass it in the vertex shader input. This allows you to reuse the same output in both vertex and pixel contexts (e.g., for World Position Offset and Base Color).
Note that plugging a vertex interpolator directly into a purely vertex-stage input (like World Position Offset) is invalid and will cause an error, since a vertex interpolator output only exists in the pixel stage.
Regards,
Lance
There are no official content examples that I know of, but I have put together an example to demonstrate one usage.
In this example I am applying the UV scale (which controls the tiling of the texture on the output) in the vertex shader. This is a small but easy optimization that can be done for most shaders without affecting the visual result (as long as the UVs are scaled linearly).
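The reason the visual result is unaffected can be sketched with a quick numeric check (assuming standard barycentric interpolation of vertex attributes, which is linear in the attribute): scaling per-vertex UVs and then interpolating gives the same value as interpolating first and scaling per pixel.

```python
import numpy as np

# Sketch: linear UV scaling commutes with attribute interpolation, so moving
# the multiply from the pixel shader to the vertex shader changes nothing
# visually. (Assumes standard barycentric interpolation across a triangle.)
def interp(per_vertex_values, bary):
    """Interpolate per-vertex attributes at a point given barycentric weights."""
    return np.dot(bary, per_vertex_values)

uvs = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # per-vertex UVs
bary = np.array([0.2, 0.5, 0.3])                      # one sample point
scale = 8.0

a = interp(uvs * scale, bary)   # scale in the "vertex shader", then interpolate
b = interp(uvs, bary) * scale   # interpolate, then scale in the "pixel shader"
assert np.allclose(a, b)        # identical result either way
```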
UV scaling in the vertex shader is also usually required on mobile platforms, as those platforms generally use half precision floating point values in the pixel shader as an optimization. This causes mobile platforms to produce blocky sampling precision artifacts if UVs are scaled in the pixel shader. The vertex shader, by contrast, always uses full precision floating point values. So we can safely scale UVs in the vertex shader on mobile.
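The blockiness can be illustrated numerically (assuming mobile pixel shaders use IEEE half precision, i.e. float16): float16 has a 10-bit mantissa, so the gap between representable values grows with magnitude, and a scaled-up UV lands in a region where those gaps span many texels.

```python
import numpy as np

# Illustration of half-precision UV quantization: the spacing between
# representable float16 values near a scaled UV of 33.0 is 2**-5 = 0.03125,
# while float32 spacing at the same value is far below one texel.
step_half = float(np.spacing(np.float16(33.0)))   # 0.03125
step_float = float(np.spacing(np.float32(33.0)))  # ~3.8e-06

# On a 1024-texel texture, a 0.03125 UV step is a ~32-texel jump, which is
# why pixel-shader UV scaling produces blocky sampling artifacts on mobile.
print(step_half, step_float)
```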
You can see in this example that I am using a named reroute node (ScaledUV) to keep my shader graph clean. However, since this named reroute node is used in multiple shading contexts (the world position offset path as well as the base pass and normal), we need to use a shader stage switch to ensure that the operation is valid in all contexts. If I remove the shader stage switch, this material fails to compile because it tries to use a vertex interpolator node in a vertex shading context (the world position offset path). To be clear, everything to the left of a vertex interpolator is forced into a vertex shader context, while everything to the right must be in a pixel shader context. This means we can’t use its output for world position offset, which always operates in the vertex shader context.
Another good example of where this pattern is useful is when using a vertex interpolator inside a material function. If I want the material function to be usable in both pixel shader and vertex shader contexts, I need this same vertex interpolator -> shader stage switch pattern.
[Image Removed]
Is there an example of its usage in the Content Examples or another project? It’s still a bit hard to visualize.