Gerstner Wave Normals

I have been able to create Gerstner waves for world displacement and they look great in wireframe, but when switching to Lit the waves aren’t noticeable at all. I assume this is because there is no normal map. I have searched around and found a few answers, but am still having trouble.

http://http.developer.nvidia.com/GPUGems/gpugems_ch01.html is what I originally used to figure out how Gerstner waves work. It talks about normals, but being new to all this I can’t seem to wrap my head around what I’m supposed to do to calculate the normals from it. I had also read something about using a finite difference approximation as an alternative to what is written in the GPU Gems article, but I wasn’t able to figure out how to apply that here either.

I was hoping someone might have some specific information on how exactly to get normals from the Gerstner wave equation within Unreal Engine 4, or whether there is some way to get the lighting to react to world displacement.

I hate asking for help with my first post to these forums but after searching around and attempting this for a week straight I am not sure what else to do. ^^;

With both displacement and World Position Offset, it’s assumed that the effects of the displacement are already contained in the normal map. This is the right choice for offline-processed meshes from 3ds Max, but not for procedurally displaced things like you are doing.

What you have to do is manipulate the Normal input in the pixel shader with the same logic that you applied to the displacement or WPO. However, the normal needs to be perpendicular to your surface, while the displacement is just a delta offset. One way to solve this is to run two points through the procedural displacement, where the second point is offset along the vertex normal from the first. Then you can subtract the points after they have been displaced to get the new normal.
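Here is a minimal sketch of that two-point idea in HLSL. Everything in it is illustrative: GerstnerOffset is a hypothetical stand-in for whatever displacement logic you already feed into World Position Offset (a single wave here just to make the sketch self-contained), and the parameter names are mine, not engine inputs.

// A stand-in for your existing Gerstner displacement (one wave, illustration only).
float3 GerstnerOffset(float3 p, float2 dir, float amplitude, float waveLength, float time)
{
    float k = 2.0f * 3.14159265f / waveLength;                   // wavenumber
    float phase = k * dot(dir, p.xy) - sqrt(980.0f * k) * time;  // 980 = gravity in UU/s^2
    return float3(dir * amplitude * cos(phase), amplitude * sin(phase));
}

// Displace the vertex and a second point nudged along the original vertex normal,
// then take the direction between the two displaced points as the new normal.
float3 DisplacedNormal(float3 worldPos, float3 vertexNormal, float2 dir, float amplitude, float waveLength, float time)
{
    const float nudge = 1.0f;   // distance along the original normal, in world units
    float3 p0 = worldPos + GerstnerOffset(worldPos, dir, amplitude, waveLength, time);
    float3 q = worldPos + nudge * vertexNormal;
    float3 p1 = q + GerstnerOffset(q, dir, amplitude, waveLength, time);
    return normalize(p1 - p0);
}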

Doing this in the pixel shader (Normal input) is expensive, so you can do it in the vertex shader with Customized UVs (you will need to break the float3 normal into two float2 UVs) and then use the interpolated result in the pixel shader (Normal input).
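For the Customized UVs route, the packing is just splitting the computed float3 across two float2 interpolators. A rough sketch, where which UV slots you use and the input names UV0/UV1 are assumptions on my part:

// Vertex shader side (Customized UV inputs on the material):
//   CustomizedUV0 = displacedNormal.xy;
//   CustomizedUV1 = float2(displacedNormal.z, 0);

// Pixel shader side (a Custom node with two float2 inputs, UV0 and UV1, fed by TexCoord[0] and TexCoord[1]):
return normalize(float3(UV0, UV1.x));   // wire this into the Normal input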

Did you check out this: JBaldwin - UE4 Content Preview Thread - Work in Progress - Unreal Engine Forums

I think that’s all you need :wink:

Cheers!

In the code below, the XYZ components of the result vector form the normal; the W component is the wave height.


static const float PI = 3.14159265f;
static const float g = 19.6f; /** In UU, 9.8f in meters; */

// Wavenumber and angular frequency
float k = 2 * PI / wave_length;
float w = sqrt( g * k );

// Compute the wave phase and a 0..1 wave function
float x = w * world_time - k * dot(wave_direction, world_position.xy);
float wave_func = (sin(x) + 1.0) / 2.0;

// Compute height
float h = wave_amplitude * pow(wave_func, wave_steepness);

// Compute normal (XY components; Z is the 1.0 in the return below)
float2 dh = wave_steepness * wave_direction * k * wave_amplitude * pow(wave_func, wave_steepness - 1.0) * cos(x) / 2.0;

return float4(dh * normal_factor, 1.0, h);
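A quick usage sketch for the pixel side, assuming (my naming) the Custom node output above comes out as a float4 called Result and the material is set up to accept a world-space normal:

float3 worldNormal = normalize(Result.xyz);   // Result.w still carries the wave height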

You can check the Gerstner Ocean example from the SeaCraft project (the link is in my signature).

https://d3j5vwomefv46c.cloudfront.net/photos/large/848354193.jpg?1397421240

I’m having a very similar problem to this, but I built my shader in Blueprint. I’ve made multiple attempts at different ways to get full 3D waves, and with none of them could I get normals working well.

I tried using just tessellation with multiple custom textures panning and a perfectly matching tangent-space normal map (baked in Blender right off the displacement map). It just looked… weird.

In Blueprint form, how can I pick the normals (preferably per-pixel) off tessellated (or WPO-displaced) dynamic meshes and feed them into the Normal slot?

I found something that can get me hard normals (per-face?), which is really, really ugly and won’t do, as you can clearly see the triangles in the mesh. http://gyazo.com/14fca990c580a6c25e0c82f501e0e575
http://gyazo.com/f3992cf3d7a202a6cb2349ad98ec0dd5 (ignore the terrible colors, I’ll make it pretty once the normals work)
I’ve messed around with every little thing I could find on the internet, honestly made three different shaders based on new ideas, and with each one I’ve hit the same wall.

Just like everyone else, I was having trouble getting the normals right.
I managed to get it working in the end. It’s still a work in progress, but I thought it might help out a few people at this point already.

I implemented it using HLSL in a custom node in the material.

The code below is adapted from several pieces I found that are all based on the GPU Gems article.
The two things I was doing wrong in the normals calculation were:

  • I was using the dot product of the Direction and the WorldPosition, instead of the dot product of the Direction and the offset I just calculated.
  • I didn’t add the WorldPosition to the offset in the dot product.
float r = wi * dot(Direction, WorldPosition + offset) + phi * Time;

I missed that last part originally, but it is in the original article: notice that in equation 9 it adds the offset to x and y to get the final world position of the vertex.

The “subtract from one” part in the normal calculation is done by initialising the normal variable at the start of the script as float3(0,0,1).

float3 directions[3];
directions[0] = normalize(Direction1);
directions[1] = normalize(Direction2);
directions[2] = normalize(Direction3);

float3 offset = float3(0, 0, 0);
float3 normal = float3(0, 0, 1);   // the "subtract from one" starting value

for (int i = 0; i < NumberOfWaves; i++)   // assumes NumberOfWaves <= 3
{
    float wi = 2 * 3.1415926 / WaveLength;              // frequency from wavelength
    float WA = wi * Amplitude;
    float phi = Speed * wi;                              // phase constant
    float Qi = Steepness / Amplitude * NumberOfWaves;    // per-wave steepness factor
    float rad = wi * dot(directions[i], WorldPosition) + phi * Time;
    float sine = sin(rad);
    float cosine = cos(rad);

    // Gerstner displacement (equation 9 in the GPU Gems article)
    offset.x += Qi * Amplitude * directions[i].x * cosine;
    offset.y += Qi * Amplitude * directions[i].y * cosine;
    offset.z += sine * Amplitude;

    // Re-evaluate the phase at the displaced position for the normal
    float r = wi * dot(directions[i], WorldPosition + offset) + phi * Time;

    // Accumulate the normal (the "subtract from one" terms)
    normal.x -= directions[i].x * WA * cos(r);
    normal.y -= directions[i].y * WA * cos(r);
    normal.z -= Qi * WA * sin(r);

    // Next wave octave; wi is recomputed from WaveLength at the top of the loop
    Amplitude /= Ratio;
    Speed /= Ratio;
    WaveLength /= Ratio;
}
return offset;
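A Custom node only gives you a single output, so getting both the offset and the normal out of this takes a small workaround. One simple (if wasteful) option, and this is my assumption rather than something stated above: duplicate the Custom node, keep return offset; in the copy wired to World Position Offset, and change the last line of the copy wired to the Normal input to:

return normalize(normal);   // world-space normal, so untick "Tangent Space Normal" in the material properties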

Result: