High Accuracy Timing, Tick Rates, Etcetera

Greetings,

I have recently started working with timers instead of the default Event Tick and have some questions. I posted about this on the Unreal Engine 4 Developers group over at Facebook, but wanted to get deeper insight here to completely understand this topic. My goal is to have a highly agile tick, using timers. For my basic example, I'm making a clock. Currently, it houses seconds, minutes, hours, days, months and years. I'd like to possibly get this down to milliseconds, but it's not a "must-have".

So here is my first curiosity… I’ve been seeing more and more that breaking away from Event Tick and using our own timers is good practice, which is why I have been moving over to that. I also want to achieve as much frame-rate independence as I possibly can. Someone replied to my post on Facebook, saying that timers can’t achieve a higher tick frequency than frame-rate, which is a bit disappointing if true. Is this true? If so, how would we go about getting an accurate and frame-rate independent tick? I’m okay working with a common 60Hz tick, but desire the option to tick faster.

My second curiosity is related to the built-in DateTime structure. I notice that all of its variables are integer types. For the system I'm cooking up, I need to be able to add floating point values to the days, which works fine for my own variables/structure that uses floats. However, I love the functions available on the built-in DateTime node, such as DaysInMonth, IsLeapYear, etc. As I add time to my own clock variables and to the DateTime structure, everything seems to work well, until I speed up the time rate (with a multiplier, not using time dilation). At a certain point the DateTime structure variables start to lag behind. I'd like to be able to fast-forward time up to 1,000x or more. I know most games tend to run time 2-30x faster than real time, but for testing and higher customization, I want things functioning accurately at higher rates. I'm just uncertain why the DateTime structure can't keep up with my custom clock setup.

In a nutshell, I’m totally curious about high accuracy timing and tick rates that are decoupled from frame rates. What options do we have in Blueprints, outside of DeltaSeconds off of the Event Tick node? I’m familiar with the GetWorldDeltaSeconds and GetAccurateRealTime nodes, but am uncertain if they are the route to go. Would utilizing a TimeSpan and getting milliseconds be a good option, and if so, how would I best add time to this to work well in a clock setup?

What I'm building is a clean and accurate clock with functions for calculating Julian days, solar time, so on and so forth, based on several papers for calculating the sun's position. Basically, a day/night system. Before anyone points it out: I'm aware there are options on the Marketplace, I've seen the great tutorials by Kleiner Bear on YouTube, and I've read over many other tutorials on this topic. I developed my own clock/timing system in C++ for a game engine I was working on a couple of years ago, and that beast had nanosecond accuracy. My main problem is getting down timers and tick rates, since that applies to just about everything else we create! I want to understand the optimal techniques :confused:

Thanks for any insight!

I'm not familiar with the date and time functionality, but I'm assuming you're losing accuracy converting from float to int, with the decimals getting dropped. That would be a source of error most obvious at a high time scale. Although to be fair, even floats wouldn't be immune to this sort of thing given a high enough time scale.

For frame-rate independent motion with timers, you need to scale any movement by 'Get World Delta Seconds', which is the same delta time that Tick gives (the time the last frame took).

That is, this should work… I do have some trouble with the game running faster when packaged than in the editor, which makes it hard to fine-tune speed values.

6ixpool - I had this in the back of my head and you are probably correct. I kept hoping I could avoid it, but in the end it looks like I’ll have to replicate the utility functions available to the DateTime structure, using floats. It’s gonna be a lot of work, but what can ya do? :stuck_out_tongue:

FrederickD - I'm aware of how to do things the classic way, with the default Event Tick. If I were to scale things in my system using the deltas from that, it would make things wonky in my tick setup, since it's likely running at a different rate with different deltas… If I understand correctly, the Event Tick delta is still based on frame times. Consistent perhaps, but still based on frame rates, technically.

It is dependent on frame time of course, but it is expressed as a fraction of a second. The idea is that you can express motion in m/s, for example, based on real-world motion, no matter the frame step. So more frames per second gives smoother motion, but the relative speed stays consistent.

So if you have a timer, you can still use that delta. I think it is more precise than a timer step; with small time steps, I think a timer might not keep up all the time.

I should have made my understanding of utilizing deltas (not just time) clearer, so as not to waste anyone's keyboard life, haha. The same thing happened over at the Facebook group: everyone explaining how to use delta time for movement and such. I've written my own game engine in C++, with nanosecond-precise timing, physics with various integrators, basic AI, particle systems, etc… Alas, I totally brain-farted with my types (float → int), which indeed caused a loss of accuracy after a bit. Even after moving to a totally float-based setup, though, the issue still arises. I just have to keep the time rate multiplier at a reasonable level and it doesn't happen. Until I find a proper solution, no fast-forwarding time at 1,000x or more…

I still hold hope that someone will come along and make a light bulb click on!
