Need help getting consistent times from DeltaTime


    I can explain more if needed, but here's the problem:

    I have a float set to 0.025. Every tick, I subtract DeltaTime from it, and when it reaches <= 0, I trigger an action and reset the float.

    In my mind this approach should be framerate-agnostic (besides a single frame of wiggle room), but when I time the results there is a noticeable difference in trigger rate between t.MaxFPS = 40 and t.MaxFPS = 120. Is this an issue in my approach, or am I misunderstanding how FPS affects things?

    #2
    Originally posted by Archduke_
    AFAIK UE4 does not use a fixed timestep, so the delta you are getting is the time elapsed since the last frame.

    HTH



      #3
      That seems to agree with how I understood DeltaTime (as the time elapsed from the last tick to the current one). It wouldn't explain my inconsistency though, since I'm subtracting the delta (an amount of time that has passed) from a float that represents a certain amount of time.

      I'm beginning to suspect that because my float is so small (0.025, whereas 60 FPS is ~0.016 s per frame and 40 FPS is 0.025 s per frame), the +/- 1 frame of slack from checking <= 0 actually creates a significant difference. In other words, a faster framerate gives a higher chance of triggering the action very close to when my float passes 0, whereas a lower framerate might far overshoot it and waste time. Does this sound like a possibility?



        #4
        Yes, that's exactly it. A simple thought experiment: if your float is 0.3300001 and each delta time is 0.33, you'd trigger the action only every 2 frames, when clearly you'd expect to be triggering almost every frame.

        The reset needs to include this 'remainder', and thankfully that's easy enough: instead of setting the float back to 0.025 each time it drops below zero and the event triggers, just add 0.025 to it.



          #5
          Implemented your fix, and it worked! There's still a bit of variability, but it's small enough now that it isn't an issue. Thanks!
