Implementing a Screen-Space Projected Grid

I’m trying to implement a screen-space projected grid (Real-time water rendering - Introducing the projected grid concept - Habib's Water Shader) as part of a simple planar water system I’m developing, and I’ve gotten a bit stuck on how to actually produce the mesh.

I’ve been looking at UProceduralMeshComponent, but it seems to be designed for meshes that are updated infrequently at best. I’ve also considered drawing a quad in a custom render pass, but I’d like to take advantage of the standard material/deferred rendering pipeline.

Any help is greatly appreciated!

To clarify, the main thing I’m trying to do is attach a grid mesh to the viewport and adjust its distance along the camera’s Y axis. Would it be possible to do this in a material using vertex offset, or would it be better to add custom code to the mesh pipeline?

Both ways are viable.

Do you have any suggestions on where to look for implementing the projection in the mesh pipeline? It seems like it could be viable to modify the vertex position calculations in BasePassVertexShader.usf, add an extra preprocessor definition to switch to screen-space projection, and somehow add a parameter defining the plane I’m projecting onto.

You could just do it in a material.

Well, I’ve tried messing around with the AttachMeshToTheCamera node in combination with the CameraOffset node, and it kind of does what I want with a few caveats:

  1. The mesh is only rotated and translated, not scaled to fill the screen.

  2. The mesh can only be projected relative to the active camera using these nodes. If, as in some implementations of the screen-space projected grid, I want to limit the camera angle (to prevent using camera rays that never intersect the plane), I can’t do that with these nodes.

I’ve also had to set the bounds scale on the mesh component very high to ensure the mesh is never distance-culled. I’ll look into injecting some custom HLSL code to see if I can get better results that way.

Nope and nope.
You have the view parameters accessible, so you can scale it any way you want to fit a given FOV, aspect ratio and near clipping plane. All you need to do is attach the mesh to the camera at a known position and orientation.
Likewise, nothing stops you from supplying the basis vectors of the virtual projector as material parameters.

I mean, that is what my setup does; there’s not really a question about that :stuck_out_tongue:

After a bit more testing, it seems like I may want to try adding a vertex shader to the mesh pipeline. I’m looking at Unreal Engine 4 Rendering Part 1: Introduction | by Matt Hoffman | Medium to learn more about Unreal’s rendering process, and I’ll try to post an update if I can figure it out.


I’m new to UE4 and I’m trying to do exactly the same thing as you, but so far without success.
I came from Ogre3D, where I implemented this grid to simulate the sea surface. I used GLSL and everything worked perfectly, as shown in .
I’m trying to use the same technique in UE4, but it seems to be much more complicated. What do you say we try to work together and get this working in UE4?