I have recently started working with timers instead of the default Event Tick and have a few questions. I posted about this on the Unreal Engine 4 Developers group on Facebook, but wanted to get deeper insight here so I can fully understand the topic. My goal is a highly flexible tick, driven by timers. As a basic example, I’m making a clock. Currently it tracks seconds, minutes, hours, days, months and years. I’d like to get it down to milliseconds eventually, but that’s not a must-have.
So here is my first question. I’ve been seeing more and more advice that breaking away from Event Tick and using our own timers is good practice, which is why I’ve been moving over to them. I also want to be as frame-rate independent as I possibly can. Someone replied to my Facebook post saying that timers can’t achieve a higher tick frequency than the frame rate, which is a bit disappointing if true. Is it? If so, how would we go about getting an accurate, frame-rate-independent tick? I’m okay working with a common 60 Hz tick, but I want the option to tick faster.
My second question is about the built-in DateTime structure. I notice that all of its variables are integer types. For the system I’m building, I need to add floating-point values to the days, which works fine for my own float-based variables/structure. However, I love the functions available on the built-in DateTime node, such as DaysInMonth, IsLeapYear, etc. As I add time to both my own clock variables and the DateTime structure, everything works well, until I speed up the time rate (with a multiplier, not time dilation). At a certain point the DateTime structure's variables start to lag behind. I’d like to be able to fast-forward time to 1,000x or more. I know most games run at 2-30x real time, but for testing and higher customization I want things to stay accurate at higher rates. I’m just unsure why the DateTime structure can’t keep up with my custom clock setup.
In a nutshell, I’m curious about high-accuracy timing and tick rates that are decoupled from the frame rate. What options do we have in Blueprints, outside of DeltaSeconds from the Event Tick node? I’m familiar with the GetWorldDeltaSeconds and GetAccurateRealTime nodes, but I’m unsure whether they are the way to go. Would using a TimeSpan and working in milliseconds be a good option, and if so, how would I best add time to it in a clock setup?
What I’m building is a clean, accurate clock with functions for calculating Julian days, solar time and so on, based on several papers on computing the sun’s position; basically, a day/night system. Before anyone points it out: I’m aware there are options on the Marketplace, I’ve seen the great tutorials by Kleiner Bear on YouTube, and I’ve read many other tutorials on this topic. I developed my own clock/timing system in C++ for a game engine I was working on a couple of years ago, and that beast had nanosecond accuracy. My main problem is nailing down timers and tick rates, since they apply to just about everything else we create! I want to understand the optimal techniques.
Thanks for any insight!