How do I add a global post processing shader which projects the scene texture onto an arbitrary mesh instead of a screen-aligned quad/rect?

Title says it all: I would like to create a global post processing shader that projects its input scene texture onto an arbitrary mesh, such as a hemisphere, instead of a screen-aligned quad/rect.

How would I go about implementing such a shader?

I’m not sure how to do this, but I just want to check that I understand the question:
You want to make a material for a mesh that is a giant picture of your scene?

Edit: I re-read it. You want to make what would normally be a full screen post process into a post process that just affects part of the screen that has a certain mesh in it?

Sorry, I should’ve explained myself better. What I want is a post processing effect that, every frame, takes the image the camera has rendered as an input texture, applies it to a mesh, renders that mesh and outputs the resulting image to the screen.

That should give me a way to deform the rendered image based on the shape of the mesh: By setting a cylinder or a sphere as the mesh I could make it seem like the game runs on a curved monitor even on a flat screen.
Unfortunately for me, I lack the experience in graphics programming to find a solution on my own.
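To illustrate what I mean, here’s roughly the remapping I have in mind, written as plain Python rather than shader code (this is just illustrative — the function name, half-angle and exact projection are made up by me, not anything Unreal provides):

```python
import math

def cylinder_uv(u, half_angle=math.radians(30)):
    """Map a flat-screen U coordinate in [0, 1] to the U coordinate a
    cylindrical screen of the given half-angle would sample.
    Hypothetical remap for illustration only."""
    x = 2.0 * u - 1.0                                 # to [-1, 1]
    theta = x * half_angle                            # angle on the cylinder
    x_flat = math.tan(theta) / math.tan(half_angle)   # project back onto the flat plane
    return 0.5 * (x_flat + 1.0)                       # back to [0, 1]
```

A pixel shader would apply a remap like this to the UV before sampling the scene texture; samples get pulled toward the screen center, which is what makes a flat image look like it wraps around a cylinder.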

From what I’ve seen so far I believe that creating a material that does this is either impossible or needlessly inefficient, but global (HLSL) shaders look far more promising. There’s a good up-to-date guide on adding some simple global pixel shaders on Caius’ blog, but setting up global non-pixel shaders (e.g. vertex or geometry shaders) seems incredibly complicated by comparison and I have yet to find any up-to-date information about them online, which is why I’m asking here in the Unreal forum.

And thank you for taking interest ^^

Did you end up solving this issue of yours? What did you end up doing?

In the end I gave up on learning to use global vertex and geometry shaders, but while looking for alternative solutions I found out that nDisplay’s Mesh Policy does exactly what I needed.

However, if you want your scene projection to happen directly in the editor and without having to learn how to use nDisplay and Switchboard first, you can also just place a camera directly in front of a mesh whose texture is a render target filled by a Scene Capture 2D (which acts as the real camera in that case). I verified that the Scene Capture 2D is rendered before the main camera, so you’re not introducing a 1-frame delay this way.

Sounds really cool! Thank you for getting back to me!

If you don’t mind me asking, why did you give up on learning about vertex and geometry shaders? I am currently in the process of diving into it, and as a technical artist I am worried that it might be too overwhelming to grasp. If you are a graphics programmer or a technical artist yourself, I am worried that I am taking on a bit too much…

Haha, I’m not a real tech artist, I’m just writing my bachelor thesis about lens distortion in Unreal 5. Even today I know very little about graphics programming, which is likely the reason why I had such a bad time with global shaders in Unreal. The biggest roadblock for me is that I’ve no idea what I’m doing, and it doesn’t help that there are very few up-to-date learning resources online on global shaders in Unreal. It would likely take me many months of studying to get the hang of it, which wouldn’t be worth it for only a possible minor optimization.

But don’t let yourself be discouraged by my fate, I’m sure it’s gonna be a lot easier for someone like you to learn global shaders :blush:

I wish you luck, and feel free to keep me updated on your progress, or better yet, write blog posts about your findings.

I’ve no idea what I’m doing

Lol, same… With the deeper engine stuff I sometimes look at code and feel a question-mark grow to the size of a football field.

I wish you the best with the thesis! :slight_smile:
