How to measure 1 second in execution?

I want something to happen every second (or any other interval), but I have 3 problems:

  1. If I add the tick's delta seconds to some float that represents time, it is bound to time, but 1 in that float is not 1 second. It takes around three and a half seconds for that float to reach 1 (no matter what the FPS is, of course).

  2. Set Timer by Event with 1 will fire every second, but I want the rest of the code in the event to have a higher call rate, like 0.02, which is 50 calls per second…

  3. If I combine the two, it messes up the execution a little and starts to behave weirdly.

How can I easily get 1 second from tick?


A timer is the way; just have two timers. One for your per-second code, and one for the other code.

Incidentally, if the system gets busy, or you run this on an older PC, having very finely set timer nodes probably won’t work anyway.

The system will do its best to organize things, but if you have a bad frame rate, you're going to lose accuracy, especially if you hang something complex on the timers.
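In C++ terms, a minimal sketch of that two-timer setup, assuming a hypothetical actor with made-up handle and handler names:

```cpp
// Two independent looping timers on one actor (hypothetical AMyTowerActor members:
// SlowTimerHandle, FastTimerHandle, EverySecond(), EveryFiftiethOfASecond()).
void AMyTowerActor::BeginPlay()
{
    Super::BeginPlay();

    // Per-second logic, e.g. the reload.
    GetWorldTimerManager().SetTimer(
        SlowTimerHandle, this, &AMyTowerActor::EverySecond,
        /*InRate=*/1.0f, /*bLoop=*/true);

    // Faster logic, e.g. target detection, roughly 50 times per second.
    GetWorldTimerManager().SetTimer(
        FastTimerHandle, this, &AMyTowerActor::EveryFiftiethOfASecond,
        /*InRate=*/0.02f, /*bLoop=*/true);
}
```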


Agreed with ClockworkOcean.
But also, to add something extra, you could consider making your own (tickable) component for each of those things that need to happen,
if it makes sense to have them as components, that is.
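For instance, a minimal sketch of such a component (hypothetical names); each behaviour gets its own tick interval instead of everything sharing one actor graph:

```cpp
// IntervalComponent.h -- minimal tickable component sketch (hypothetical names).
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "IntervalComponent.generated.h"

UCLASS(ClassGroup=(Custom), meta=(BlueprintSpawnableComponent))
class UIntervalComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    UIntervalComponent()
    {
        PrimaryComponentTick.bCanEverTick = true;
        PrimaryComponentTick.TickInterval = 1.0f;   // tick roughly once per second
    }

    virtual void TickComponent(float DeltaTime, ELevelTick TickType,
                               FActorComponentTickFunction* ThisTickFunction) override
    {
        Super::TickComponent(DeltaTime, TickType, ThisTickFunction);
        // DeltaTime is the accumulated time since this component last ticked (~1 s here).
        // Put this particular thing's per-interval behaviour here.
    }
};
```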


Thank you

I basically use that to reload a tower in my tower defence; it's the time it takes to shoot.

After the reload time, there is a system of detecting enemies and performing object pooling on the projectiles. Right now it's all one path.

FPS is not a problem; the game is low poly, with highly optimized content and draw calls.
But multiple timers can cause issues, because right now there is one execution path (reload → detect enemy → manage the projectile pool → shoot) and it's working; I just don't know the exact reload times.

If I split that into two different execution paths, it can get messy: one path is reloading but the other path didn't get the enemy in time, etc. It can become very difficult in theory.

@RedGrimnir, hope it's ok that I'm tagging you here. Do you think the gate solution you suggested in my previous post can help with this issue?


Well, first of all I will keep it short, then I will go into details.

Short: a timer is the way. With tick, if something goes wrong, it can miss the actual time. If you don't know the reload time, e.g. it's driven by an animation, you can always use events like "animation finished" to continue your script's logic. It will maintain the order in your scripts that way.

That being said, accuracy-wise, tick addition is less accurate than a timer. Let's say your accumulator is about to reach 1 second: it's at 0.98. You don't know the computer's next tick; it can be 0.02 or 0.04… If it's longer than what is actually needed, you already have a 0.02 error. A timer, on the other hand, uses the engine's libraries to calculate time, which are much higher precision. I don't have very in-depth knowledge about it, but it should be accessing the low-level platform time of the device, so it's better.
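For what it's worth, here is a minimal sketch of the tick-accumulator approach (hypothetical actor and member names). Subtracting the interval instead of resetting the float to zero keeps that per-frame overshoot from accumulating into long-term drift, though each individual firing is still only as precise as the frame:

```cpp
// Tick-based one-second trigger (hypothetical AMyTowerActor with a float TimeAccumulator
// member and an OnOneSecondElapsed() handler).
void AMyTowerActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    TimeAccumulator += DeltaSeconds;
    while (TimeAccumulator >= 1.0f)   // fire once per elapsed second
    {
        TimeAccumulator -= 1.0f;      // keep the remainder, don't reset to 0
        OnOneSecondElapsed();         // your per-second logic
    }
}
```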

Long: In physics you cannot really measure something exactly. Time as a concept is not directly measurable. To measure it, as with many subjects in mathematics, we use slicing methods to calculate things accurately, or at least be predictive about them. When it comes to time, for centuries humankind has tried to measure it with various methods: from sundials, to pendulums, to quartz oscillators, to atomic clocks. So it is the resolution of the slices of time that defines the accuracy. Even with these machines it's not accurate enough, and I think there will be more inventions around it. So the question becomes: are the Windows/Linux/iOS clocks accurate? Well, not really :slight_smile: As far as I know they are synced against various time services, and BIOS clocks are quartz, so not very accurate. So, generally, ONE SECOND is not one second. A thing that can blow your mind: what makes one second? Why is a unit a unit? Did you know that one kilogram actually has a physical origin (a prototype object), and the whole metric system is built on that? This can go on and on; many famous scientists took part in this subject, and there are many aspects to measuring something accurately, especially for time, where we only quite recently discovered relativity.

BUT you don't have to know what ONE SECOND actually is to make your game work :slight_smile:

Everything @ClockworkOcean and @nande said is right.


Here are some interesting videos about it, if you find these things as interesting as I do :slight_smile:

The kg is dead, long live the kg

How long is a second?


BTW, just to add some info you might want to know:

  • Timers (as well as tick components) are affected by time dilation. If you need something precise, you might want to do it differently.

That sentence is a bit ambiguous to me, but timers and ticks are virtually the same.
They get called on the game thread when their "delay" expires,
so if your FPS is bad, you won't get a precise timer.
It will never interrupt the execution (that would be terrible), and if it did, you'd have to worry about re-entrancy. In that case a component would be better, even if only to manage the timer.

A timer, or a tick interval, means:

  • It will happen in AT LEAST this amount of time; no precision or guarantee.

Having said that, it's super unlikely that a tick interval won't trigger at all.

Also, tick intervals work better with the lifetime of objects.
I had a crash because I had a recurring timer and forgot to clear it when the object was scheduled for destruction (before it actually got destroyed).
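A minimal sketch of that cleanup, with hypothetical actor and handle names:

```cpp
// Clear recurring timers before the actor goes away, so the timer manager
// never calls back into a dying object (hypothetical AMyTowerActor members).
void AMyTowerActor::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
    GetWorldTimerManager().ClearTimer(ReloadTimerHandle);
    Super::EndPlay(EndPlayReason);
}
```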

I'm sorry, but I don't think that's how it works.
I'm not sure why you assume that it "should" behave in a certain way. ( https://www.youtube.com/watch?v=RpXyy2RLnEU )

I just checked the code: the timer manager ticks, like any other object, and then tracks the time with addition, just like a component.
There's virtually no difference between them.

The other advantage of ticks vs. timers is that you can pause a tick.

The original post does not state what it is trying to do, so I can't provide a better answer.
But in terms of timer vs. tick, I'm confident in what I just said.
If you need that kind of atomic precision and a frame is too much of a difference, I think you need to re-evaluate what you're trying to achieve.
Maybe you'll have to implement your own time tracking, or account for the difference.
You could implement your own time on another thread and read the platform time, or call Windows' GetTickCount (which could be more precise than platform time, since it has the same number of bits but a fixed precision, but it's certainly not portable and a headache).

I have hopes that your goal can be achieved with a regular timer/tick if it's engineered properly.


Sure thing, I just assumed; I didn't even go into FTimerManager, but I think somewhere the engine does talk with the system time, since you can get those values. Anyway, that's off-topic after reading your comment.

Well, then at least we know that now. In the original post I assumed that an actual time calculation, as a unit of 1 second, was being asked about.

The other advantage of ticks vs. timers is that you can pause a tick.

I think you can pause a timer too? Whatever, I just agree that anything can be done without needing that much precision.
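Indeed, both can be paused; a small sketch, reusing the hypothetical actor, handle and component names from earlier:

```cpp
// Pausing and resuming both mechanisms (hypothetical AMyTowerActor members).
void AMyTowerActor::SetTimingPaused(bool bPaused)
{
    if (bPaused)
    {
        GetWorldTimerManager().PauseTimer(SlowTimerHandle);   // keeps the remaining time
        SetActorTickEnabled(false);                           // Tick() stops being called
        IntervalComponent->SetComponentTickEnabled(false);    // same for a component's tick
    }
    else
    {
        GetWorldTimerManager().UnPauseTimer(SlowTimerHandle); // resumes where it left off
        SetActorTickEnabled(true);
        IntervalComponent->SetComponentTickEnabled(true);
    }
}
```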


Yes, at some point it does.
You have FPlatformTime::Seconds (as it's visible on the screen),
and there are other methods to get the game start time and the current time;
I don't remember them off the top of my head.
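For reference, a small sketch of the usual time sources, called from a hypothetical actor so GetWorld() is valid:

```cpp
// Common time sources (sketch, hypothetical AMyTowerActor member function).
void AMyTowerActor::LogTimeSources() const
{
    const double PlatformNow = FPlatformTime::Seconds();         // raw high-resolution platform clock
    const float GameTime     = GetWorld()->GetTimeSeconds();     // game time: stops when paused, scaled by dilation
    const float RealTime     = GetWorld()->GetRealTimeSeconds(); // real time since the world started
    const float FrameDelta   = GetWorld()->GetDeltaSeconds();    // last frame's delta time

    UE_LOG(LogTemp, Log, TEXT("Platform=%f Game=%f Real=%f Delta=%f"),
           PlatformNow, GameTime, RealTime, FrameDelta);
}
```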

But now that I think about it, it almost doesn't matter at all, unless you dive into multi-threading.

You'd have to implement your own timer, and track and compare against whatever time source you use.
But if you want that to happen on the game thread, and you mostly will, you can only do so in between frames.
Which means your resolution is back to where the frames are, now with added overhead.
So it's the same resolution as a timer, but with added overhead.

If you were to use AsyncTasks to communicate with the game thread, you've now lost all accuracy and can be delayed by additional frame(s).

The only way to implement an accurate timer would be to implement your own thread (not an async task).
Then on that thread have some sort of spin lock or similar, but again, a spin lock can incur inaccuracies.
So you'd need to access the operating system's timer interrupt, which I'm not sure you can do easily.
Maybe you're lucky and UE implements an API for that, but I'm not confident.
But even if you do that, what do you do then?
You're in another thread.
I'm not confident you can even access physics state from another thread, as it might be in the middle of a calculation.
And if you wait for the physics to finish (which would be difficult), you've lost accuracy again.
And you'll have to be super careful with locking objects so that the GC doesn't kill them or take them from you, which, again, shifts the time, and is difficult and error prone.
And there are a ton of things you can't do on another thread, like creating objects, destroying them, or making certain modifications (like changing the transform of certain objects, IIRC).
So you'll need to communicate back to the game thread, and you've lost accuracy again. Even if you use some sort of thread lock, if your timer really needs to be accurate, a thread lock takes a random amount of time.
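Purely for illustration, a minimal, hypothetical sketch of such a dedicated timing thread; note that the game thread still has to poll the counter, which is exactly where the accuracy is lost again:

```cpp
// Hypothetical dedicated timing thread (sketch). It counts elapsed intervals
// using FPlatformTime, but the game thread still has to poll the counter,
// so the effective resolution ends up back at the frame rate.
#include "CoreMinimal.h"
#include "HAL/Runnable.h"
#include "HAL/RunnableThread.h"
#include "HAL/ThreadSafeBool.h"
#include "HAL/ThreadSafeCounter.h"

class FIntervalThread : public FRunnable
{
public:
    explicit FIntervalThread(double InInterval) : Interval(InInterval) {}

    virtual uint32 Run() override
    {
        double NextFire = FPlatformTime::Seconds() + Interval;
        while (!bStopRequested)
        {
            if (FPlatformTime::Seconds() >= NextFire)
            {
                ElapsedIntervals.Increment();   // never touch UObjects from here
                NextFire += Interval;
            }
            FPlatformProcess::Sleep(0.001f);    // coarse sleep; a pure spin wait burns a core
        }
        return 0;
    }

    virtual void Stop() override { bStopRequested = true; }

    FThreadSafeCounter ElapsedIntervals;        // polled from the game thread

private:
    double Interval;
    FThreadSafeBool bStopRequested;
};

// Usage sketch (e.g. in BeginPlay):
//   Runnable = MakeUnique<FIntervalThread>(1.0);
//   Thread   = FRunnableThread::Create(Runnable.Get(), TEXT("IntervalThread"));
// Then poll Runnable->ElapsedIntervals in Tick, which is exactly the
// frame-bound resolution discussed above.
```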

And finally, the physics are calculated in the same timesteps as the timer. So let's say you dropped a frame and the DT is 0.0166666667 * 2; then the physics will be calculated with that DT (AFAIK).
Your accurate timer won't be able to work with that.

So, in conclusion, unless you truly, truly, truly require an accurate timer, I'd try to engineer the problem so that the regular timer is enough.

AFAIK, nope. You could fake it, but it won't be accurate.

I don't know exactly what you want, but look what I did for you:


Everybody says "timers", but it sounds to me as if you have a game logic question, not a timing question.

Your game actions all have some kind of reference time, e.g. "when the reload started running". What you need to do is store that reference time on the action (or your pawn or player controller; whatever makes sense). Then, in the action, get the current game time, compare it to the stored "start" time, and that tells you how far into the action you have progressed.
If you animate a reloading action, you'll want to calculate the elapsed time on each tick so that the animation can render smoothly, exactly once per frame.
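A minimal sketch of that reference-time idea, assuming a hypothetical tower actor with a fixed reload duration:

```cpp
// Store the game time when the reload starts, then compare against it later
// (hypothetical AMyTower members: ReloadStartTime, ReloadDuration, FireAtBestTarget()).
void AMyTower::StartReload()
{
    ReloadStartTime = GetWorld()->GetTimeSeconds();   // reference time for this action
}

bool AMyTower::IsReloadFinished() const
{
    const float Elapsed = GetWorld()->GetTimeSeconds() - ReloadStartTime;
    return Elapsed >= ReloadDuration;                  // e.g. 1.0f seconds
}

void AMyTower::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (IsReloadFinished())
    {
        // detect enemy -> grab a pooled projectile -> shoot, then restart the reload
        FireAtBestTarget();
        StartReload();
    }
}
```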

There is another helpful mechanism you can use, too: The actor timeline. Right-click the event graph and choose “create timeline.” Then, you can set events that get fired after a certain amount of time has elapsed on the timeline, as well as at the end of the timeline. When you start re-loading, simply start playing the timeline.


I can waste an entire day watching the Veritasium channel; I love thinking deeply about what we've already defined as "common sense". Your explanation was very interesting and knowledgeable, thank you.

If timers are the most accurate thing we have, I will use them.
Maybe I can invent my own time unit of 1, which is close to a second, and use that as my game's 1 time unit, based on the timer.

Maybe I can make a new project and let a timer of 1 "compete" with a timer of 0.02, and match the latter's reset value to the first one (incrementally increase/decrease it until I get the reset value right).

You kinda opened my mind about this issue, tbh.

Bro, I mentioned Veritasium before seeing those links; I'm replying top to bottom! :grin:

Thank you for the in-depth explanation. That reply made me realize I need to learn C++ in Unreal as well. The Blueprint VM makes me feel I have no control over what I need in the game.

I do feel this is a bit above my current understanding of Unreal to execute properly.


That was actually going to be my next post; I'm happy you mentioned it.

Is that the same as animation events?
If so, aren't animation events less safe, and shouldn't they only be used for playing particles and extra stuff?

I do want to delay the start of the fire action to half the duration of the animation itself.
So if the animation is 1 sec (it is 60 frames, 1 sec), I want the code to start only after 0.5 sec. Both of them run constantly (both the animation and the code), but because the code is delayed by 0.5 sec, it runs at a 0.5 sec offset, which is good for me.

Can animation events achieve that? (On frame 30, I set an event to start shooting.)
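For reference, one timer-based way to get that half-animation offset, sketched with hypothetical members; Montage_Play returns the montage's play length in seconds, so half of it becomes a one-shot delay:

```cpp
// Play the fire animation, then start the shooting code halfway through it
// (hypothetical AMyTower members: FireMontage, FireDelayHandle, Fire()).
void AMyTower::PlayFireSequence()
{
    USkeletalMeshComponent* Mesh = FindComponentByClass<USkeletalMeshComponent>();
    UAnimInstance* AnimInstance = Mesh ? Mesh->GetAnimInstance() : nullptr;
    if (!AnimInstance || !FireMontage)
    {
        return;
    }

    const float PlayLength = AnimInstance->Montage_Play(FireMontage);  // montage length in seconds, 0 if it failed
    if (PlayLength > 0.f)
    {
        // One-shot timer: fire the code path at the halfway point of the animation.
        GetWorldTimerManager().SetTimer(
            FireDelayHandle, this, &AMyTower::Fire,
            PlayLength * 0.5f, /*bLoop=*/false);
    }
}
```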


I'll test that in a new project and see how it works for me as well. Thank you for that example.


Not quite. Animation events are fired by running animations on some object, whereas timelines are a native object within the actor event graph itself.
An example is here: