I currently have a mask that I create inside a material. That mask is used to mask out part of a mesh in the scene based on an actor's location. The tricky part is that once that is working, I want to spawn Niagara particles right on the edge of that mask.
I tried sampling the color of the mesh, but that only gets the vertex color, not the actual material output. I also looked for ways to transfer the mask information out of the material, but it doesn't seem possible.
If anyone has an idea how to do this or how to fake a similar result I would really appreciate it.
We don't have a way to get the data out of a material yet, but you could pass the mask texture into a GPU simulation. You could then do rejection sampling (i.e. spawn a lot of particles on the surface and immediately kill any that don't meet your criteria).
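To make the rejection-sampling idea concrete, here is a small CPU-side sketch in plain C++ (not Unreal API). The mask function, radius, and falloff values are placeholders standing in for the material's logic; the point is the spawn-many-then-kill pattern, keeping only candidates whose mask value sits in the transition band (the edge):

```cpp
#include <cmath>
#include <random>
#include <vector>

struct Vec3 { float x, y, z; };

float Distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Placeholder mask: a soft sphere mask around the actor. Radius/falloff
// values are illustrative, not taken from the original material.
float MaskValue(const Vec3& p, const Vec3& actor, float radius, float falloff) {
    float m = (radius - Distance(p, actor)) / falloff;
    return std::fmax(0.0f, std::fmin(1.0f, m));  // saturate to [0, 1]
}

// Rejection sampling: spawn many candidate positions, keep only those whose
// mask value is strictly between 0 and 1 (i.e. on the falloff edge),
// and "kill" (discard) everything else.
std::vector<Vec3> SampleEdgeParticles(const Vec3& actor, float radius,
                                      float falloff, int count, unsigned seed) {
    std::mt19937 rng(seed);
    std::uniform_real_distribution<float> u(-2.0f, 2.0f);
    std::vector<Vec3> kept;
    for (int i = 0; i < count; ++i) {
        Vec3 p{u(rng), u(rng), u(rng)};
        float m = MaskValue(p, actor, radius, falloff);
        if (m > 0.0f && m < 1.0f)  // transition band = mask edge
            kept.push_back(p);
    }
    return kept;
}
```

In a GPU Niagara emitter the same test would run per particle right after the surface-location spawn, killing particles that fail it on their first frame.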
Sadly, I'm not masking with a texture. I use a sphere mask with the actor location as an input and multiply that with a noise (plus a few other things), so I can't simply reuse the mask with a texture sample in Niagara.
You could either reproduce similar logic in the Niagara system or you could render the mask to a texture and use that as a lookup for the GPU sim.
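To illustrate the bake-to-texture route without leaning on Unreal APIs, here is a toy C++ sketch: evaluate the procedural mask once into a small grid, then read it back by UV, the way a GPU sim would sample a baked render target. The grid size and the example mask are assumptions, not the actual material:

```cpp
#include <algorithm>
#include <array>
#include <cmath>

constexpr int N = 64;  // illustrative "texture" resolution

float Saturate(float v) { return std::clamp(v, 0.0f, 1.0f); }

// Example procedural mask in UV space (placeholder for the material logic).
float Mask(float u, float v) {
    float du = u - 0.5f, dv = v - 0.5f;
    return Saturate((0.4f - std::sqrt(du * du + dv * dv)) / 0.1f);
}

// "Render" the mask once into a grid, like drawing it to a render target.
std::array<float, N * N> Bake() {
    std::array<float, N * N> tex{};
    for (int y = 0; y < N; ++y)
        for (int x = 0; x < N; ++x)
            tex[y * N + x] = Mask((x + 0.5f) / N, (y + 0.5f) / N);
    return tex;
}

// Nearest-neighbour lookup, like sampling the baked texture in the GPU sim.
float Sample(const std::array<float, N * N>& tex, float u, float v) {
    int x = std::clamp(static_cast<int>(u * N), 0, N - 1);
    int y = std::clamp(static_cast<int>(v * N), 0, N - 1);
    return tex[y * N + x];
}
```

In-engine, the bake step would be a render target drawn with the mask material (e.g. via a Scene Capture or a draw-material-to-render-target call), and the lookup would be a texture sample in the Niagara GPU emitter.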
I didn't see any sphere mask in Niagara, and I must admit I lack the knowledge to recreate one from scratch. Would you know how, or where I could find that information?
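The math behind a sphere mask is small enough to recreate in a Niagara Scratch Pad or custom module. Here is one common formulation, sketched in plain C++; the parameter names are illustrative and it is an approximation of the Material SphereMask node, not guaranteed to match it bit-for-bit:

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

float Distance(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Sphere mask: 1 inside the sphere, fading to 0 at the radius.
// hardness in [0, 1): 0 = soft falloff across the whole radius,
// values near 1 = hard edge.
float SphereMask(const Vec3& p, const Vec3& center, float radius, float hardness) {
    float d = Distance(p, center);
    float soft = std::max(1.0f - hardness, 1e-4f);  // avoid divide-by-zero
    return std::clamp((1.0f - d / radius) / soft, 0.0f, 1.0f);
}
```

In Niagara you would feed `Particles.Position` and a user parameter holding the actor's location into the same expression, then multiply by your noise, just as the material does.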