I can explain more if needed, but here’s the problem:
I have a float set to 0.025. Every tick I subtract DeltaTime from it; when the float reaches <= 0, I trigger an action and reset the float back to 0.025.
In my mind this approach should be framerate-agnostic (besides a single frame of wiggle room), but when I time the results I see a noticeable difference in trigger rate between t.MaxFPS = 40 and t.MaxFPS = 120. Is this an issue in my approach, or am I misunderstanding how FPS affects things?