Getting less Tick events than frames

Hi guys,

I was implementing an auto-run feature and tried using the event Tick to achieve that.
Unfortunately, when the auto-run was on (just an Add Movement Input on Event Tick), I noticed the Event Tick was only called 10 times per second while I had a steady 120 FPS.
I even deleted all nodes from my ThirdPersonCharacter blueprint except the Event Tick; nothing changed.

I created a new project, added the same tick-count measurement code as in my development project, and I get 120 ticks per second at 120 FPS.
What could be causing this?

I finally ditched the Tick event and went for this:

Unfortunately, if I use any value higher than 0.005 seconds in the delay, my character moves 1 cm per call instead of running automatically.

That leads me to think something could be really broken in my project, but after spending quite some time on the matter, I can't figure out what's wrong.
Any help will be appreciated, thanks!

Well, InputActionAutorun is not an OnTick event.

So was the code in both projects the same, or was one the input action and the other the OnTick event?

Regardless, using Simple Move To Location may be a better idea.
Set it around 10 m ahead and have it fire every so often (but less often than Tick).
Try it in a test project and see how it goes.
Maybe a recursive function for it is in order,
though you could just branch-lock it and make the delay much longer with the same code you showed.

I’ll have a look into a MoveToLocation solution, it’s interesting, but I’m still wondering why I’m getting less ticks than frames (with a proper Tick Event, not the custom Autorun event I pictured above).

Why not just use **InputAxis** MoveForward? If you enable “Autorun”, just force Add Movement Input to 1 instead of the Axis Value.

Works like a charm, thanks, I should have thought of that *smacks forehead*

I’m still wondering why I’m getting 10 ticks per second with 120 FPS though.

There is a tick for every frame, so at 60 FPS you have 60 ticks per second. But frames per second can vary, so Tick events have a float called Delta Time, which tells you how many seconds have passed since the last tick.
Not sure how you measured 10 ticks per second?

With this code snippet, I get 10 in my current project while I get 120 in a blank project.
Both projects display 120 fps.

This gives the same results:

Where did you add this code? Not everything is loaded at the same time. Get Time Seconds returns the time in seconds since the world was brought up for play.
I don’t think this kind of measurement is accurate.
Create a Set Timer by Function Name node and loop it every second. In Tick, increase a counter, and in your timer function reset it to 0 (or 1).
Or better solution:

Do 1 / Delta Seconds in Tick.

The code runs in the ThirdPersonCharacter blueprint in both projects, and checking Delta Seconds as shown in my last screenshot should give accurate tick results anyway.

TickInterval might be set to something other than 0.0 on that Actor. If it is not 0.0, it will not “Tick” every frame.
You find **TickInterval** in the **Details** panel under the category Actor Tick.

Note that a Delay set to 0.0 acts as a skip to the next frame. I would not recommend using a Delay for ticking, though.
This is what I do:

  • Tick every frame? TickInterval set to 0.0
  • Tick less often than every frame? TimerHandle
  • One-time frame skip? Delay set to 0.0

This will show you FPS. You were printing Delta Seconds instead of FPS.

That was it.
Thank you very much, everything’s sorted now!

You’re welcome.

It is a bit odd that there is an option to change the Tick Interval, but someone must have thought it was a good idea at some point.
Having an event called Tick that doesn’t tick every frame leads to this type of hidden error.