In my game, the player moves objects, and the player camera movement is driven largely by VInterps that were controlled like this, the way most tutorials show. However, this causes the speed of the object’s movement to vary wildly when the frame rate changes.
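Roughly what I mean, sketched in C++ rather than Blueprint (the class and member names such as `CameraTargetLocation` and `CameraInterpSpeed` are just placeholders, not my actual setup):

```cpp
#include "Math/UnrealMathUtility.h"

void AMyPlayerPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // VInterpTo eases toward the target each frame. Feeding it the frame's
    // DeltaSeconds is what the tutorials show, but the easing is applied per
    // frame, so the effective speed can still drift as the frame rate changes.
    const FVector NewLocation = FMath::VInterpTo(
        GetActorLocation(),
        CameraTargetLocation,   // assumed member, set elsewhere
        DeltaSeconds,
        CameraInterpSpeed);     // assumed member, e.g. 5.0f

    SetActorLocation(NewLocation);
}
```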
I tried to solve the problem like this, and it seems more consistent across different frame rates, but I’m unsure about this solution because I don’t know whether it is efficient, or how to adjust the update speed of the timelines that I have to place in every Blueprint that uses Event Tick.
Using a timeline with a track that goes from 0 to 1 as the alpha of a Lerp would work well, and it is a better solution.
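For anyone who prefers seeing it in code, here is a rough C++ equivalent of that pattern (the curve asset, timeline, and member names are made up for the example, and the bound function must be declared as a UFUNCTION):

```cpp
#include "Components/TimelineComponent.h"
#include "Curves/CurveFloat.h"

void AMyPlayerPawn::BeginPlay()
{
    Super::BeginPlay();

    if (MoveCurve) // assumed UCurveFloat* asset with keys at (0,0) and (1,1)
    {
        FOnTimelineFloat ProgressDelegate;
        // OnMoveProgress must be a UFUNCTION() for BindDynamic to work.
        ProgressDelegate.BindDynamic(this, &AMyPlayerPawn::OnMoveProgress);
        MoveTimeline.AddInterpFloat(MoveCurve, ProgressDelegate);
    }
}

void AMyPlayerPawn::OnMoveProgress(float Alpha)
{
    // Alpha runs from 0 to 1 over the curve's length, so the move takes the
    // same amount of game time regardless of frame rate.
    SetActorLocation(FMath::Lerp(MoveStart, MoveEnd, Alpha));
}

void AMyPlayerPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);
    MoveTimeline.TickTimeline(DeltaSeconds); // a raw FTimeline needs manual ticking
}

// Call MoveTimeline.PlayFromStart() when the move should begin.
```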
Also, you don’t have to store Delta Seconds; use Get World Delta Seconds for gameplay mechanics instead. It scales with the world’s time dilation (time scale).
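In C++ that node corresponds to UGameplayStatics::GetWorldDeltaSeconds; a minimal sketch, reusing the same placeholder names as above:

```cpp
#include "Kismet/GameplayStatics.h"
#include "Math/UnrealMathUtility.h"

void AMyPlayerPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Equivalent of the "Get World Delta Seconds" node; no need to cache the
    // Tick delta in a variable. It already respects global time dilation.
    const float WorldDelta = UGameplayStatics::GetWorldDeltaSeconds(this);

    SetActorLocation(FMath::VInterpTo(
        GetActorLocation(), CameraTargetLocation, WorldDelta, CameraInterpSpeed));
}
```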
Do you know if I could run into a similar problem with the timeline’s Update, caused by differences in CPU speed rather than frame rate? I wasn’t able to find anything about this in the documentation. I wish there were a way to time things directly in milliseconds, independent of the frame rate.
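The closest thing I have found so far is deriving the alpha from elapsed game time instead of per-frame deltas, along these lines (MoveStartTime, MoveDuration, MoveStart, and MoveEnd are placeholder members, and I am not sure this is the intended approach):

```cpp
void AMyPlayerPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // MoveStartTime would be set to GetWorld()->GetTimeSeconds() when the
    // move begins. GetTimeSeconds() returns the world's game time in seconds
    // (GetRealTimeSeconds() gives undilated real time), so the object ends up
    // in the same place at a given moment no matter how many frames were
    // rendered or how fast the CPU is.
    const float Elapsed = GetWorld()->GetTimeSeconds() - MoveStartTime;
    const float Alpha = FMath::Clamp(Elapsed / MoveDuration, 0.f, 1.f);

    SetActorLocation(FMath::Lerp(MoveStart, MoveEnd, Alpha));
}
```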
It also looks like the interp methods yield different results on the client and the server. I think this should be added to their descriptions so we don’t have to learn it the hard way.