Hey guys, so I’ve come over from Cryengine and realized I need water.

This is what I’ve got so far: Gyazo
As you may or may not have noticed, the normals added from the textures are fine, but the normals that are derived from the tessellated displacement/WP offset are 100% sharp.

I need to get smooth normals, or smooth them after the fact, either way. This end result is an improvement on what I had before, but looks ugly if I displace it more than a little bit.
Initially I couldn’t get normals from the displacement or world position offset at all, so I did some research, found this snippet, and that worked.

How can I get smoothed normals from world position offset and world displacement?

Thanks,
Seth

If you need more pictures then just ask, I would show the whole blueprint but I have no idea how to take a screenshot big enough.

DDX and DDY return the screen space derivative (aka slope) of an input function. They are very powerful functions but you need to understand their limitations.

Mathematically, the slope at any point on one of your mesh’s triangles is going to be a constant, hence the facets. In some cases you can normalize the input or other tricks but in this case I think you will need to solve the offsets in the pixel shader manually.

The material function called createnormalfromfunction is a bit painful to use, but it should give you smooth results. You basically have to hook up the same function 3 times with offset coordinates supplied by the function (UV1, UV2 and UV3). The UV can be worldpos if that’s what you put on the input side. You need to get the “height” value the vertex shader sees for each of those corresponding inputs.

With more finesse you can also probably get a more efficient result if you know how to calculate the derivatives of your vertex shader functions. That varies function to function but they are usually available.

Tomorrow I can get you some info on that function.

For now, I had a simpler idea to get rid of the facets using the ddx ddy method you previously used.

Select the WorldPosition node and select the (excluding offsets) option. Then place an Add node, put the WorldPosition (excluding offsets) into A, and into B plug in whatever you were plugging into World Position Offset.

Instead of getting the vertex-solved world position, you should hopefully get a per-pixel value.

That screenshot with *cross(ddx,ddy) * is from a material which is meant to render a mesh/landscape flat shaded on purpose, so basically the opposite of what you are looking for.

As RyanB pointed out, you can calculate the normals from the world position offset wave function (assuming it’s only offsetting the mesh along the z-axis). Comparing 2 height offsets with the original position to get the slope works well for a texture height map, but you really don’t want to do the whole wave math 3 times for different positions. Instead you should calculate the x/y derivatives of the wave function; this will give you the normal’s x and y components. It’s up to you how to calculate the derivatives, either doing it manually or using ddx/ddy.
Things might get pretty expensive if you are combining lots of waves with varying size or speed, but you can always limit the normals calculation to a certain amount of waves, e.g. the 2-3 largest ones.
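For example, for a sum of directional sine waves the partial derivatives can be written down directly instead of sampling extra positions. A minimal Python sketch, with made-up wave parameters:

```python
import math

# hypothetical wave list: (amplitude, frequency, direction)
WAVES = [(10.0, 0.05, (1.0, 0.0)),
         (4.0, 0.15, (0.6, 0.8))]

def wave_height(x, y, t=0.0):
    # the world position offset function: a sum of directional sines
    return sum(a * math.sin(f * (dx * x + dy * y) + t)
               for a, f, (dx, dy) in WAVES)

def wave_normal(x, y, t=0.0):
    # analytic partials: d/dx of a*sin(f*(dx*x + dy*y)) = a*f*dx*cos(...)
    ddx = sum(a * f * dx * math.cos(f * (dx * x + dy * y) + t)
              for a, f, (dx, dy) in WAVES)
    ddy = sum(a * f * dy * math.cos(f * (dx * x + dy * y) + t)
              for a, f, (dx, dy) in WAVES)
    n = (-ddx, -ddy, 1.0)
    l = math.sqrt(sum(c * c for c in n))
    return tuple(c / l for c in n)
```

Each extra wave only adds one cosine term per axis, instead of a full re-evaluation of the whole height function.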

RyanB, with your method I was able to get good lighting and a result that looked really good, and it is per-pixel. But it looked artifact-y, and upon further analysis, breaking the displacement map down, I ran into this (picture), which is most likely causing my ocean to look noisy (picture).

How would I either smooth smaller normals or get scene per-pixel instead of texture per-pixel? I tried messing with normalize and flatten normals hoping I could get an improvement, but that didn’t help at all.

mAlkAv!An, I’ve read your post multiple times, and as a high school student who hasn’t done anything with calculus, most of what you’ve said just went right over my head. I’m trying my best to learn and understand this stuff as I go.

Also, is there any way to capture a screenshot of my blueprint that’s high resolution enough for you guys to read?

Also, further speculation on why the lighting looks off: there isn’t any reflection being rendered for any wave that faces away from the camera. My guess is that if I could use a cubemap with the SSR rendered on top of it, I would get far more realistic lighting. However, I have no idea how I would achieve this.

I kind of expected that. As I said, DDX and DDY have some limitations. You could try multiplying down the intensity of the WorldPositionOffset value that is added to the WorldPosition(ExcludingOffsets), since maybe the amount of height being used there is too great. Or it could be pixelation induced by how DDX and DDY operate.

Those values exist because the GPU needs to know the derivatives of material functions to calculate things like mipmaps. Since the hardware already has the data, we can get access to it cheaply. The values are solved in little 2x2 blocks, which means if the math going through there is very high frequency, you can get a noisy or blocky result.

It is possible that there really is that much noise in your math function, but the vertices were acting as a kind of lowpass filter, showing you a nicer macro pattern. I have seen that before several times, where a material’s tiling pattern was actually set an order of magnitude too small, but somehow a macro pattern that looked somewhat correct emerged at the sampled resolution and the material almost passed scrutiny. Crazy stuff.

I believe you are actually seeing a separate issue here which is that you are only seeing Screen Space reflections. They can only reflect what is actually on screen thus you get a weird ‘disappearing reflections’ effect as you tilt the camera up and down. You need to place a sphere reflection actor and either add a “skylight actor” set to capture the level’s sky dome or you can specify a cubemap to use in the skylight. Alternatively, you can specify an ambient cubemap in the post process volume or specify an environment color intensity in world properties. For all of those options you will need to rebuild lighting using lightmass and preferably add a lightmass importance volume around your immediate level objects.

Not currently although I have seen it asked many times around here that we get support for higher resolution images working. For now you have to just host images on an external website.

Basically you have some sort of function for the world position offset, let’s say multiple sine nodes. If you calculate the partial derivatives of this function you’ll get the slope, which is the same as the red and green channel of a normal map.
DDX and DDY can calculate the partial derivatives for you but only in screen-space and in blocks of 2x2 pixels. This might be the reason for your rather noisy results.
To get around this issue you can manually calculate the derivatives in the material (which requires some basic knowledge of calculus).
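As a tiny illustration of that last point, here is how a pair of slopes maps onto the red and green channels of a normal map (Python, assuming the usual [-1,1] to [0,1] texture encoding):

```python
import math

def slope_to_normal_rg(dhdx, dhdy):
    # the surface normal of a height field is normalize(-dh/dx, -dh/dy, 1)
    n = (-dhdx, -dhdy, 1.0)
    l = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    n = tuple(c / l for c in n)
    # encode [-1,1] into [0,1], the way a normal map texture stores it
    return ((n[0] + 1) * 0.5, (n[1] + 1) * 0.5)
```

A flat surface (zero slope both ways) lands on the familiar (0.5, 0.5) normal-map color.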

The basic gist of it is that the Noise function is duplicated 3 times. The “Position” input on each Noise function is UV1, then UV2, then UV3. Note that I had to append a 0 to the UV1, UV2 and UV3 outputs, since Noise uses a V3 input but the UVs are V2 (this function creates a normal map from 2D coordinates). They must all use all other parameters exactly the same, hence “Offset” is plugged into “FilterWidth” for each. If your world position offset is a big complex shader, you would actually need to duplicate the whole thing 3 times, replacing anywhere that WorldPosition was used with one of the 3 UV outputs, then link the output of each version of your world position shader into Function(UV1), Function(UV2), Function(UV3), etc.

Notice that worldposition.rg is specified as the Coordinates which means this can still operate on worldposition just fine.

Inputs:
UVs (V2): Coordinates to evaluate function over.
Height Map UV Offset (S): This is the delta x,y offset value used in the function. You will rarely ever need to change this. Smaller numbers result in more accuracy until you reach the point where precision creates even worse problems (seems to be when less than 0.00001).
Normal Map Intensity: Strength of the normals. Only if you want to change how strong the result is. Generally it does a good job because the intensity is dependent on the offset which gives “correct” results.
Function(UV1): Plug the 1st copy of the height function here.
Function(UV2): Plug the 2nd copy of the height function here.
Function(UV3): Plug the 3rd copy of the height function here.

Outputs:
Normal: Resulting normal map
UV1: Plug this into the 1st copy of the height function. This one has 0 offset.
UV2: Plug this into the 2nd copy of the height function. This contains the X offset.
UV3: Plug this into the 3rd copy of the height function. This contains the Y offset.

mAlkAv!An is correct when he states that manually calculating the derivatives using calculus will give superior results. It’s generally much faster and won’t require duplicating a whole tree of nodes 3 times. Sometimes, though, the math is out of reach, or you just want to see what it would look like first before deciding whether to invest the necessary time. That’s really what the normalfromfunction function is for; it is a bit messy and expensive to do too much with it.

This is copied from the JBaldwin wave thread where he integrated the Position part of the following math:

So in this case, Position would be the math used in the world position offset shader. B= binormal, T=tangent, N=normal.

Those last 3 would all be considered the derivatives. You could elect to solve any one of them and then re-formulate the other two using cross products, or you could solve all three to get the most robust possible results. Usually we end up solving one derivative and then derive the other two using a cross product with 0,0,1 and the function “CreateThirdOrthogonalVector”. The obvious weakness there being the result fails if the vector is ever exactly 0,0,1 but you can solve that using clamps or Ifs generally.
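To make that concrete, here is a Python sketch of deriving the other two basis vectors from one solved normal via cross products with (0,0,1), including the guard for the degenerate case (the name `basis_from_normal` is mine, not an engine function):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def basis_from_normal(n):
    # derive tangent and binormal from one solved normal by crossing
    # with (0,0,1); guard the case where n is (nearly) exactly 0,0,1
    up = (0.0, 0.0, 1.0)
    if abs(n[2]) > 0.999:
        up = (1.0, 0.0, 0.0)  # fall back to another axis
    t = normalize(cross(up, n))
    b = cross(n, t)  # already unit length: n and t are orthonormal
    return t, b
```

The if-guard plays the role of the clamps/Ifs mentioned above: without it, cross((0,0,1), (0,0,1)) is the zero vector and the normalize divides by zero.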

For some functions, like sin(x), the derivative is simply cos(x), so for some shaping functions it is easy. You should try messing with an online derivative calculator; they can be very informative. Or just search for how to find a derivative. What the normalfromfunction route does is test 2 points on either axis manually rather than solving the derivative with calculus. It’s kind of a brute-force way of doing things, really.


I’m trying to generate the normal from a 3D noise vector (curl noise, actually).
I’ve really put effort into this, but I don’t understand why a simple cross product doesn’t return the correct normal direction.

One suggestion would be: what exact effect are you attempting to get? Could you show someone a reference of an effect with a similar vision? There is so much work that has been done with liquids and fluids. What you’re trying to do could be only a few clicks away with something like NVIDIA GameWorks WaveWorks, a Blender fluid sim with Alembic, etc. I spent days staring at my screen trying to make a beautiful “cartoonesque stylized” water when it was right under my nose. You can look at the ocean modifier in Blender. There are also people using morphs to create ripples and waves. I haven’t done this yet in engine, but I heard that performance-wise it was cheap. I watched the Moana making-of/behind-the-scenes several times. I’m not saying I’m lazy, but that degree of perfection is a bit too much work for me to attempt. I don’t want to spend six months making a less perfect water like that, only to find it becomes a plugin for the engine in 3 months. Lol.

I thought about it again and realized I misunderstood what you are trying to do. Trying to get the normal from the displacement of curl noise itself.

I think the problem all just comes from how you are doing your offsets and then transforming to tangent space to alter the Z based on the magnitude.

A simpler approach could be to apply your magnitude to the curl noise itself and add that to the coordinate basis. Then use the cross of ddx,ddy like you would with worldposition to get a faceted normal. That will restrict the derivatives to the surface itself rather than sampling them in full strength 3d only to flatten the Z. There is probably a way to do it without ddx,ddy as well.
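A sketch of that cross(ddx, ddy) idea in Python, treating the ddx/ddy neighbor positions as explicit arguments (the names here are mine, for illustration):

```python
import math

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def faceted_normal(p, p_right, p_down):
    # p is the pixel's displaced world position (base surface plus
    # magnitude * curl noise); p_right / p_down stand in for the values
    # ddx and ddy would see in the neighboring pixels
    dx = tuple(a - b for a, b in zip(p_right, p))
    dy = tuple(a - b for a, b in zip(p_down, p))
    return normalize(cross(dx, dy))
```

Because both difference vectors lie on the displaced surface itself, the resulting normal is already in the surface frame, with no need to flatten the Z afterwards.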

To clarify, I want to apply a 3D noise offset to a tunnel shape and correct the normal so I can use a fresnel effect to fake SSS.
It could be any kind of noise or function.
Offsetting along the normal direction doesn’t give a wavy enough result, which is why I’m trying 3D noise.

Until now I’ve been using the distance to the center of the tunnel to light the texture. It’s working pretty well, but I’d really like to know how to generate a normal from a 3D function, since
I’m only using real-time effects, so it’s a must-have for future projects.

Here is the tunnel with 3D curl noise applied to it.

I could also use 3D Perlin noise, but I haven’t found information about the relation between its gradient and the normal map I’m searching for.
Any clue?

I’ll try my best to solve this.
Thanks Ryan for the advice, I’m reordering things right now.

Thanks to your material I’ve been able to debug and compare against a final proper result.
So you have to sample the noise 2 times, with an offset in a different direction each time.
The offset distance changes relative to the scale of your noise, so you can choose the noise resolution you want.
Then you add the offset noise to the actual position on the surface PLUS the offset distance.

So you have to be able to tell the position of the neighboring surface. For a plane it’s simple, and also for a tunnel.
The example works only with a plane, and since it’s calculated in world space no rotation is possible, but I’ll post the final version.

On top: the DDX/DDY technique. Bottom: Normal From Function.
The results are exactly the same, with better detail on sharp angles.

Ryan, thank you again.

PS: I’ve been inspired by your example and now use a grid-like texture and a secondary colored light. It helps a lot!

Excellent! I am glad you were able to debug for the proper analytical solution. Looks very nice.

Looks like removing the scaling from Z and tangent space was important. It may also save you a few instructions to disable the “Tangent Space Normal” option in the material. That way you don’t need the final World->Tangent transform. There is a chance the compiler notices and removes them but worth a shot and will simplify the graph a bit.

I used a tunnel mesh spawned at the world origin, looking down the X axis, positioned with the normals facing inside.
We need to know the exact position of the surrounding pixels in world space to sample the noise function 3 times to determine the normal.
So using this effect on a complex mesh geometry isn’t possible with the sampling approach; the DDX and DDY technique should be used instead.

We have to use flat tessellation, since PN tessellation automatically adds an offset to the vertices to smooth the surface, and I don’t know how to get this value to sample the noise at the correct position.

I offset the vertices of the tunnel to create a perfectly constant radius.
Same technique as before to calculate the normal: here I need a sample with an offset on the X axis, and a sample with an offset of the same distance along the tunnel radius (same X position).
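The sampling scheme described above might look like this in Python (the radius, the noise field, and all the names are made up for illustration):

```python
import math

R = 200.0  # hypothetical tunnel radius, axis along X

def surface_point(x, theta):
    # point on the constant-radius tunnel surface at axial position x
    return (x, R * math.cos(theta), R * math.sin(theta))

def noise(p):
    # stand-in for the 3D noise; any scalar field works for the sketch
    return 10.0 * math.sin(0.02 * p[0]) * math.cos(0.02 * p[1] + 0.02 * p[2])

def tunnel_normal(x, theta, d=0.1):
    # sample the noise at the pixel and two neighbors: one offset along
    # the tunnel axis, one offset along the circumference (same X, the
    # angle step d/R covers arc length d along the radius direction)
    h0 = noise(surface_point(x, theta))
    hx = noise(surface_point(x + d, theta))
    ht = noise(surface_point(x, theta + d / R))
    # slopes in the local surface frame (axis, tangential, inward)
    sx = (hx - h0) / d
    st = (ht - h0) / d
    n = (-sx, -st, 1.0)
    l = math.sqrt(sum(c * c for c in n))
    return tuple(c / l for c in n)
```

The key trick is that the constant radius makes the neighbor positions exact, so the two samples land on the true surface rather than an approximation of it.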

Working perfectly, sharper than with DDX/DDY!
I integrated it into the material for debugging purposes.
Looks really nice!! No artifacts on the effect, it’s just the texture on the color channel.