I want to have a 2D area with generated Perlin noise on it and darken most of the final Perlin image. On the few Perlin bits that are still showing, I want to spawn an actor for each remaining cloud or blob. Is this possible with Blueprints already and I just don’t know the proper vocabulary?
There is no such node. You’ll need a clever solution.
One ( not clever ) thing I can think of is doing it with Niagara. If you spawn in a grid, the system can read how bright the texture is at each point and change the mesh material accordingly.
Thank you. I have been watching tutorials on Niagara all night and morning and still haven’t put the pieces together of how exactly to achieve what I need, but I think you are right and Niagara has the functionality in there to do it. I haven’t found anyone showing any examples of reading the brightness of each point, so I am still a bit lost but will keep looking.
I will put something together, it’s an obscure area.
Bit later though…
There’s an importance sampling node:
But that does random sampling, and is more about sampling & distributing than accurately finding “each remaining cloud or blob.”
If speed isn’t an important factor, you can use Read Render Target Raw Pixel | Unreal Engine Documentation to read each pixel of the texture.
Edit: just read that the importance sampling node is slow, too.
If you spawn the particles in a grid, you can read the texture alpha at the grid reference. I’ll come back later with an example.
Reading pixel data is prohibitively expensive at runtime…
That’s usually why other solutions are applied…
Not too bad actually
This ball of sand is made from 100k cube particles and runs at 120 fps:
( It’s getting the ball shape from a noise texture ).
I wonder if they used a different technique to read texture values in Niagara.
Usually file I/O is file I/O… the base cost is immutable.
More importantly, the base cost of accessing a pixel out of all the pixels in memory doesn’t really change.
But you are correct, Niagara runs pretty darn fast.
That ‘they’ was me. That’s my ball of yellow cubes
It was just a texture sample module, but there are two ways you can do it:
- Random particle placement. Then you have to pass details to the material to get the alpha.
- Fibonacci lattice ( which this ball was ), in which case you don’t actually need to go to the material; you can just read from the relevant point in a copy of the texture ( rough sketch below ).
I take it back, it’s only (2) in fact…
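Roughly what the lattice placement looks like in HLSL terms ( just a sketch, not the actual module code, and the names are made up ):

```hlsl
// Point i of N on a Fibonacci ( golden-spiral ) lattice over a unit sphere.
// i = particle index as a float ( 0 .. N-1 ), N = total particle count.
const float PI = 3.14159265;
const float GoldenAngle = PI * (3.0 - sqrt(5.0));

float z     = 1.0 - (2.0 * i + 1.0) / N;       // evenly spaced from ~1 down to ~-1
float r     = sqrt(saturate(1.0 - z * z));
float theta = GoldenAngle * i;

float3 Pos = float3(r * cos(theta), r * sin(theta), z);   // where the particle goes

// Because the placement is deterministic, the UV to read from a copy of the
// texture is known too ( equirectangular mapping here ):
float2 UV = float2(frac(theta / (2.0 * PI)), acos(z) / PI);
```

So each particle knows its own texture coordinate before anything touches the material.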
This is what I mean. You can spawn cubes ( or another mesh ) at the high points of a texture:
Is that what you mean?
Now with smaller particles:
On the whitest part of some noise:
That becomes essentially the same as water.
The calculation of the wave happens twice: once for a material, and once for whatever you place.
Performance on that is always A++.
How’s the reading of the texture?
There’s a Niagara module ‘sample texture’, which takes a texture directly and will sample it at UV coords. Problem is, it doesn’t give you a useful value, so you have to hack it to get the alpha. Pretty straightforward actually.
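Conceptually, the hacked module boils down to something like this ( plain HLSL to show the idea, not the exact Niagara data-interface call; NoiseTexture / NoiseSampler / Brightness are made-up names ):

```hlsl
// Sample the noise texture at this particle's UV and keep one channel as a brightness value.
float4 Sampled = NoiseTexture.SampleLevel(NoiseSampler, UV, 0);
Brightness = Sampled.r;   // or .a if your mask lives in the alpha channel
```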
I also find it almost impossible to construct meaningful expressions in Niagara parameters. The default method is a bit clumsy. But it’s ok, you can set them using HLSL.
There’s another module ‘sphere location’, which has a uniform option. You can also hack that and reverse engineer the UVs from the polar coords.
So it all stays in the emitter.
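The reverse engineering is just going from the point on the sphere back to polar coords and then to a UV ( hypothetical HLSL; Dir and UV are whatever you call them ):

```hlsl
// Dir = particle position relative to the sphere centre, normalised.
const float PI = 3.14159265;
float lon = atan2(Dir.y, Dir.x);               // longitude, -PI .. PI
float lat = asin(clamp(Dir.z, -1.0, 1.0));     // latitude,  -PI/2 .. PI/2
UV = float2(lon / (2.0 * PI) + 0.5, lat / PI + 0.5);   // equirectangular UV
```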
I know zero about HLSL, so it sounds like I will need to figure out how to use it? Do I need HLSL with this Sample Texture to achieve the first example you showed?
The first example of the large squares going to the brightest spots is what I need. Once I get to that stage, I can probably figure out how to add a random variation to exactly where within each square it puts the point down, so it will look less like it’s on a grid.
Edit: to clarify, if you could take a snip of where the grid reference is located, I think it will make more sense to me.
The crucial parts are:
If you click on the ‘eye’, you can get the modules to show you what parameters they write to:
And, as you can see, the grid location module ( which the system gives you to go with the spawn module ) writes to a UVW space:
What this is saying is that, for each particle, we get to know where it is on the texture. Exactly what we want.
Then we put the ‘sample texture’ module in, so we can read the texture. You’ll notice I’ve made my own copy so I can get just one channel. That’s only because I couldn’t figure out how to connect the SampledColor from this module with later stuff. There probably is a way, but I don’t find the parameter setup in Niagara very intuitive.
I also had the same problem getting the UVW from grid location into just UV, so I used a bit of HLSL to do that:
All it’s saying is ‘use the UV values from the UVW here’. If you know how to do this in Niagara params, show me!
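For reference, the whole thing is one line in a Custom HLSL node ( pin names here are illustrative ):

```hlsl
// Input pin:  float3 UVW  ( from the grid location module )
// Output pin: float2 UV   ( fed into the sample texture module )
UV = UVW.xy;   // keep U and V, drop W
```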
So now that I have the red channel ( how bright the texture is ), I can bin the particles that don’t fit my needs using the kill particles module:
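The condition itself is just a comparison against a threshold you pick ( sketch only; SampledRed, Threshold and KillFlag are made-up names for whatever you wire into the module’s bool input ):

```hlsl
// Kill everything below the brightness threshold, e.g. 0.8 keeps only the whitest blobs.
KillFlag = SampledRed < Threshold;
```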
The edit of the sample texture module to get the red channel is also easy ( do it in a copy! )
Tell me how it goes. Don’t spend too long fiddling. If you get stuck, I can put this in a small project for you to download.
Thank you very much for going into such detail!