Currently I spawn an actor at the end of a trace (when it hits the “floor”). The actor is pretty big (scale 1), so that at max distance it appears at a good, visible size. However, when it’s near the camera it looks too big; 10% of the original scale would make it look right up close.
The idea is to scale the actor based on its distance from the camera (which is already easy to get). While the actor is between the red vertical bars (see sketch above; along the blue double-ended arrow), it should be scaled from 0.1 to 1 based on the distance. Once it goes closer to the camera or farther from it (past the red vertical bars on the sketch, along the red arrows), the scale should remain constant (using the last value).
How do I do that (changing the scale value within the bounds, and keeping it constant once it passes the set thresholds in either direction)?
On Tick, do a raycast between your target and your character’s camera component.
Get the length of the raycast and store it in a variable named currentDistance.
When currentDistance is longer than startScaleDistance, don’t scale the actor.
When currentDistance is shorter than startScaleDistance, start scaling the actor.
Still on Tick, if scaling is on, set the world scale of your actor to currentDistance / startScaleDistance.
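The steps above boil down to a simple per-tick function. Here is a minimal sketch in plain C++ (not the UE Blueprint nodes themselves; `ComputeScale` and both parameter names are illustrative, mirroring the variables suggested in the reply):

```cpp
// Sketch of the per-tick logic described above.
// startScaleDistance is the distance at which scaling kicks in.
float ComputeScale(float currentDistance, float startScaleDistance) {
    if (currentDistance >= startScaleDistance) {
        return 1.0f;  // beyond the threshold: leave the actor at full scale
    }
    // inside the threshold: scale proportionally to distance
    return currentDistance / startScaleDistance;
}
```

Note this ratio reaches 0 right at the camera, which is what the follow-up posts below run into; clamping the low end (e.g. to 0.1) fixes that.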
Basically, at startScaleDistance the scale is x1, and by the time it gets to the farthest distance the scale is something like x13. I need the scale to be x1 at the starting point and x2.5 (or whatever I define, depending on field-test results, so to speak).
Currently I feed the value of currentDistance / startScaleDistance into Make Vector and then into a Set Relative Actor Scale node. If I normalize the value, then when 0 is fed into the scaling node the actor disappears entirely (perhaps I am not fully understanding how Normalize to Range works).
Effectively, what is occurring is that I have a sphere collision component that, when overlapped, enables tick, so the actor only ticks while the player is inside the collision sphere. On tick, it gets the length of the vector between the cube mesh and the player, then feeds that into Normalize to Range.

What Normalize to Range does is take a value and check where it falls between two fixed points, min and max. In this case, it checks where the length sits between 0 and 1000, the full diameter of the sphere. That value is then turned into a float between the two, or, another way to view it, the percentage of the way between 0 and 1000. So 750 would be 0.75, etc.

Because I wanted the object to grow, not shrink, as I approached, I then subtract 1 from the value, so 1000 becomes 0 and 0 becomes -1.0. Take the absolute value of this and you should be able to plug it directly into your scale.
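That normalize / subtract / absolute-value chain can be sketched in plain C++ like this (function and parameter names are mine, not UE API; `maxLength` stands in for the 1000 used above):

```cpp
#include <algorithm>
#include <cmath>

// Sketch of the node chain described above: normalize the distance
// into [0,1], subtract 1, then take the absolute value so the result
// grows as the player approaches.
float GrowAsYouApproach(float length, float maxLength) {
    // Normalize to Range: map [0, maxLength] onto [0, 1]
    float normalized = std::clamp(length / maxLength, 0.0f, 1.0f);
    // Subtract 1 and take the absolute value: 1 when touching, 0 at maxLength
    return std::fabs(normalized - 1.0f);
}
```

So with maxLength = 1000, a length of 750 yields 0.25, and a length of 0 yields 1, matching the behavior described.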
So, basically it scales from 1.000 to 1.005, where I need it to scale up from 1 to 2.5 (or from 1 to any higher value I find usable, 1 to 4.2 for example), so that at length 0 the scale is 1 and at length 1500 the scale is 2.5. What am I missing?
The Normalize to Range node takes values and maps them to a 0-1 range; the 1.005 may be a floating-point error. If you need something from 1 to 2.5, you will have to use the clamp as described above.
Basically, I feed the distance delta into a Normalize To Range node, then feed the output into a Lerp node’s Alpha, with the Lerp’s min/max set to whatever min/max scale I need for the actor, and lastly I feed the Lerp’s output into a Set Actor Relative Scale 3D node. Works like a charm.
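For reference, the final setup reduces to one remapping function. A plain C++ sketch of the Normalize To Range → Lerp chain (names are illustrative; `minScale`/`maxScale` correspond to the Lerp’s min/max pins and `maxDistance` to the Normalize To Range max):

```cpp
#include <algorithm>

// Sketch of the final node chain: normalize distance to [0,1],
// then lerp between the desired min and max scale.
float DistanceToScale(float distance, float maxDistance,
                      float minScale, float maxScale) {
    // Normalize To Range, clamped so the scale holds past the thresholds
    float alpha = std::clamp(distance / maxDistance, 0.0f, 1.0f);
    // Lerp: minScale at distance 0, maxScale at maxDistance
    return minScale + (maxScale - minScale) * alpha;
}
```

With maxDistance = 1500, minScale = 1, and maxScale = 2.5, this gives scale 1 at length 0 and 2.5 at length 1500, exactly the mapping asked for above, and the clamp keeps the scale constant beyond either end.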