Hi all! Let me preface this by saying I don’t know what I’m doing. This is my first project in Unreal, so please be patient with me if I’m missing obvious things or not explaining myself well.
My Goal:
I have a concept in mind where points in my 3D world generate localized post-processing: basically a visual echo that the player can see from a distance. Going deeper into the concept, I want many of these to be generated at once, many times per minute or even per second. Each effect has a spawn position in the world and a scale, both of which should be dynamically editable per tick.
What I Have:
I currently have a post-process material set up with a mask; the effect is lerped using that mask so it only displays where wanted.
My Problem:
Making the mask is my issue. That mask has to be created as a union of spherical masks, one centered at each spawn point and scaled according to that point. The problem then becomes one of passing that position and scale data to the material as a long ‘array’ of float sets.
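To be clear about what I mean by a "union of spherical masks": per pixel, each sphere contributes a value that is 1 at its center and falls off to 0 at its radius, and the combined mask is the max across all spheres. Here's a standalone C++ sketch of that math (not UE API; the struct and function names are just made up for illustration) — in the material it would be the same loop in a Custom node or unrolled graph nodes:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// One mask sphere: a world-space spawn position plus a radius (the per-point scale).
struct SphereMask {
    float X, Y, Z;  // spawn position
    float Radius;   // per-point scale
};

// Mask value at a world position: 1 at a sphere's center, fading linearly
// to 0 at its edge, combined across all spheres by taking the max (a soft union).
float CombinedMask(float PX, float PY, float PZ,
                   const std::vector<SphereMask>& Spheres) {
    float Mask = 0.0f;
    for (const SphereMask& S : Spheres) {
        const float DX = PX - S.X, DY = PY - S.Y, DZ = PZ - S.Z;
        const float Dist = std::sqrt(DX * DX + DY * DY + DZ * DZ);
        // Linear falloff, clamped to [0, 1].
        const float Value = std::clamp(1.0f - Dist / S.Radius, 0.0f, 1.0f);
        Mask = std::max(Mask, Value);
    }
    return Mask;
}
```

The max-combine keeps overlapping spheres from blowing past 1, which matters since the mask feeds a lerp.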
From what I can find, UE5 does not provide a way to directly pass primitives, much less arrays of objects, to the GPU for material rendering.
What I have tried:
The closest I have come so far is using a Canvas Render Target and encoding/decoding the values in RGBA. I still think this could work, but my limited knowledge is showing: I’m struggling to figure out how to write to the Canvas Render Target per pixel, and how to read back and decode that data in the material graph.
Screenshots: