Hey, I think what ULLS is trying to say is that you should use a linear interpolation node, also known as a Lerp.
It takes 3 input values: start, end, and a percentage.
If you haven’t dealt with this before, it takes a little mind-bending to understand what it’s doing because of the time factor.
Basically, the first parameter is the light’s current intensity, the second is a variable holding your destination intensity, and the third is a floating-point value from 0 to 1 that says what percentage of the way between the two the new value should be.
For example: light.intensity = Lerp(1000, 2000, 0.5f)
This is roughly what it might look like in actual code, but it’s the same thing in Blueprints.
In this case you would be interpolating between 1000 and 2000 by 50%, so you would get 1500 as the result.
So you would want something more like:
light.intensity = Lerp(light.intensity, targetIntensity, 0.1f)
It’s a little strange that you pass the variable as the first parameter and also assign the return value back to that same variable, but it makes sense when you think about it.
If you did what I did above and used a constant like 1000 as the start value, you would get 1500 and it would just sit there every frame. But if the start value is something that varies and gets reassigned every frame, you get a nice smooth gradation toward the target.
So say intensity starts at zero in a variable X.
X = Lerp(X, 1000, 0.1f) would give you 100; the next frame would give 190, the next 271, and so on, closing 10% of the remaining gap each frame.
Also, it’s important to realize that the Delay node you posted above is doing SOMETHING, not nothing.
I don’t know exactly how the timer code works, but I would assume you’re creating an infinite loop. If it’s not crashing, I’m not sure why, but you should get rid of that if you haven’t already, to get more reliable results.