I’m working on a material function ‘LocationBasedAlpha’ that takes two world-space points and returns a 0–1 alpha for vertices between these points… Maybe it’s easier to explain with some images:

It already works, but let me show what I did here to explain the problem better. At first I had this function:

… And this result:

As you can see, it’s not fully red at the bottom, because this is what happens (it’s the bottom of this cylinder):

It’s because it compares distances on all axes, so the farther a vertex is from the center of the cylinder, the more its alpha value drifts.
What we need to do is to compare vertex distances only along this PointA —> PointB LookAt axis:

… And because I can’t figure out how to implement that, for now I’ve fixed it by taking the LookAt vector from point A (the point where Alpha == 0.0) to point B (the point where Alpha == 1.0) [this way we have the ‘alpha axis’] and multiplying it with all the checked locations:

… This gives the correct result (as in the first image), but there is a problem:

As you can see, the alpha values are calculated correctly when the mesh is rotated at 0 or 90 degrees, but they’re wrong when it’s rotated by e.g. 40 degrees. The cube is there to help visualize the problem - it has the same material, with the same ‘alpha points’.

So I feel like I’m close to finishing this function, but I can’t figure out what’s wrong with it.
… Or maybe there is some other, better solution for this type of location-based alpha function?

I hope that I’ve explained this in an understandable way. I would be grateful for some hints - a function like this (location-based alpha) would be really useful in some dynamic interaction/generation cases, and probably in other projects too!
*PS. The material is just two colors (red & green) Lerp’ed with the calculated Alpha value.*

For a start, I’d add a ‘transform position’ node after the world position node to convert it to local space. That means that no matter where you move or rotate the object, the pivot point will always be at 0, and the Z axis will always point ‘up’ from the object’s point of view. Super useful, and it solves your second problem. Then I’d do the distance comparison against the object radius, like so:
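To illustrate what that transform buys you, here’s a minimal sketch in plain Python standing in for the node graph (assuming a simplified translate + yaw-only object transform, no scale): a vertex keeps the same local coordinates no matter how the object is moved or rotated in world space.

```python
import math

def world_to_local(p, obj_pos, obj_yaw):
    """Rough stand-in for the 'transform position' (World -> Local) node,
    assuming a translate + yaw-only object transform (no scale)."""
    # Undo the translation...
    x, y, z = p[0] - obj_pos[0], p[1] - obj_pos[1], p[2] - obj_pos[2]
    # ...then undo the rotation (rotate by -yaw around Z).
    c, s = math.cos(-obj_yaw), math.sin(-obj_yaw)
    return (c * x - s * y, s * x + c * y, z)

# A vertex at local (10, 0, 50) on an object sitting at the origin:
local_a = world_to_local((10.0, 0.0, 50.0), (0.0, 0.0, 0.0), 0.0)

# The same vertex after the object is moved to (300, 400, 0) and yawed 40 degrees:
yaw = math.radians(40.0)
world_p = (300.0 + 10.0 * math.cos(yaw), 400.0 + 10.0 * math.sin(yaw), 50.0)
local_b = world_to_local(world_p, (300.0, 400.0, 0.0), yaw)
# local_a and local_b are both (10, 0, 50): a gradient computed from the
# local position won't break when the mesh rotates.
```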

Masking that to only use the ‘z’ axis will stop you from getting the first issue you’re reporting:

However, this will only work if your pivot point is at one extreme of the axis you want to measure along. To mitigate this, you need to account for where the pivot sits within the object. For example, an object with its pivot at its center would subtract half of the object’s radius from the measured value, like so:

So for a variable pivot location you’d need to add a node here and adjust that. For you, this value would be half the object radius again, but as a negative value, because you’ve got the pivot at the base. So this step is irrelevant for your needs, but if you want the function to be more flexible, it helps to put it here anyway:

And to make it so that you can have the ‘look at’ direction (which is currently the ‘z’ axis) be any axis:

Then, if you’re having issues getting the min/max values right: the min value will be the pivot position. For the max value, you need to add/subtract from the object radius, or replace it with your own distance scalar parameter, like you have with your ‘input LocOfAlpha_1’:
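The local-Z masking, pivot offset, and min/max remap above boil down to a one-line formula. A sketch in plain Python (parameter names are mine, not actual node names, and the sign of the offset depends on your setup):

```python
def saturate(x):
    """Clamp to 0..1, like the material Saturate node."""
    return max(0.0, min(1.0, x))

def alpha_along_local_z(local_z, pivot_offset, z_min, z_max):
    """Mask the local position to Z only, shift by where the pivot sits
    along that axis, then remap [z_min, z_max] -> [0, 1].
    pivot_offset: 0 for a pivot at the base, +half_height for a centered
    pivot (hypothetical convention)."""
    d = local_z + pivot_offset
    return saturate((d - z_min) / (z_max - z_min))

# Cylinder 100 units tall with its pivot at the base:
bottom = alpha_along_local_z(0.0, 0.0, 0.0, 100.0)    # -> 0.0
top    = alpha_along_local_z(100.0, 0.0, 0.0, 100.0)  # -> 1.0
# Same cylinder with a centered pivot (local Z runs -50..50),
# shifted by +50 so the gradient comes out identical:
centered_bottom = alpha_along_local_z(-50.0, 50.0, 0.0, 100.0)  # -> 0.0
```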

The issue with that is that the ‘look at’ vector is always in world space. We can fix that with another transform node:

Thank you! I’ve tried implementing this with pointA and pointB as inputs (in my case I need to specify alpha from both points in world space); here is how it looks:

The alpha based on pivot + offset is really interesting, but it causes problems when objects have different pivots - or maybe I don’t understand something:

Both cylinders have the same material with the same pointA and pointB. The second cylinder has its pivot in the center, and it’s completely red, while the alpha should behave the same as on the first cylinder.

I’ve also tried the object radius node, but I couldn’t set it up so it would work with any object, maybe I’m missing something here…

Slavq, I think to get the effect you want, you would need to define a coordinate space with its origin at point A and one of its axes pointing towards B. Then you would transform the position into this coordinate space and measure distance along that one axis only.

Yeah, it should be absolutely doable by evaluating projections onto the axis via dot products. No idea why I drifted so far into transforming from one space to another, my bad.
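In other words, the whole function reduces to projecting the vertex position onto the A→B segment. A sketch of that projection in plain Python (in material nodes this is just Subtract, Dot, Divide, and Saturate, all in world space):

```python
def location_based_alpha(p, a, b):
    """Alpha = saturate(dot(P - A, B - A) / dot(B - A, B - A)).
    Everything stays in world space, so the result does not depend on
    the mesh's rotation, pivot location, or bounds."""
    ab = (b[0] - a[0], b[1] - a[1], b[2] - a[2])
    ap = (p[0] - a[0], p[1] - a[1], p[2] - a[2])
    t = (ap[0] * ab[0] + ap[1] * ab[1] + ap[2] * ab[2]) / \
        (ab[0] ** 2 + ab[1] ** 2 + ab[2] ** 2)
    return max(0.0, min(1.0, t))  # saturate to 0..1

# Gradient from A=(0,0,0) to B=(0,0,100); the vertex's XY offset is
# ignored because only the projection onto the axis matters:
mid = location_based_alpha((30.0, -12.0, 50.0), (0.0, 0.0, 0.0), (0.0, 0.0, 100.0))
# mid == 0.5
```

This is also why it fixes the original cylinder-bottom artifact: vertices far from the axis still project to the same point along it.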