I made a test in C++ where I save the largest value observed for deltaTime.
By dragging the window around I can make it reach values as high as 0.4 (= 1/2.5)!
I expected it to be close to 1/30.
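For reference, here is a minimal sketch of the kind of test I mean: a throwaway actor whose Tick just records the largest DeltaSeconds it has seen (the class name is only illustrative):

```cpp
// Minimal sketch: track the largest deltaTime the engine ever passes to Tick.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MaxDeltaTimeTestActor.generated.h"

UCLASS()
class AMaxDeltaTimeTestActor : public AActor
{
    GENERATED_BODY()

public:
    AMaxDeltaTimeTestActor()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Remember the largest frame time seen so far and log whenever it grows.
        if (DeltaSeconds > MaxObservedDeltaTime)
        {
            MaxObservedDeltaTime = DeltaSeconds;
            UE_LOG(LogTemp, Warning, TEXT("New max deltaTime: %f (%.1f fps)"),
                   MaxObservedDeltaTime, 1.f / MaxObservedDeltaTime);
        }
    }

private:
    float MaxObservedDeltaTime = 0.f;
};
```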
I think this “trick” can break a lot of code, or at least make it unstable.
So, is 0.4 some kind of hardcoded constant that the entire engine is designed to be able to deal with, or can we set it to 1/30 without rewriting the entire engine?
I think this is almost a bug, or at least a significant design flaw: games should run at 30 or 60 fps to be considered realtime and acceptable, and shouldn’t have to deal with “2.5 fps” scenarios caused by something unexpected happening.
This is normal behavior. It’s the time between two frames, and it can be much higher depending on your framerate. If your game is lagging like hell, it makes sense that the delta time is high.
This “trick”, as you call it, is not a trick; it just means your game is reacting badly to it. If you are in the Editor, there is an option to slow down the refresh rate of the Editor when it is out of focus. Dragging the window with your viewport around may make Unreal think it is out of focus.
While there may be an issue when moving the window around, it is in no way a problem with how the engine gives you the delta time.
In what conditions are you testing? In the Editor, Standalone, or a published build?
If I reduce the engine scalability settings to low and run in a small window with Play As Standalone, I can get 200+ fps, although I only have an i7 @ 2.2 GHz, 4 GB RAM and a GT 630M.
The problem as I see it is that if deltaTime is allowed to come through as 0.4, then I need to take that into consideration when making my game.
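For illustration, the simplest way I can see to take it into consideration is to clamp the delta my own gameplay code uses (just a sketch; the 1/30 cap and the function name are placeholders, and it only protects my code, not the engine’s):

```cpp
void AMyCharacter::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Never integrate more than one "30 fps" step per frame,
    // even if the engine reports a 0.4 s hitch.
    const float MaxStep = 1.f / 30.f;
    const float ClampedDelta = FMath::Min(DeltaSeconds, MaxStep);

    AdvanceAnimationDrivenCombat(ClampedDelta); // hypothetical gameplay update
}
```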
What I’m working on right now is having animations drive the gameplay. Since there is “Max Physics Delta Time” and “Substepping” for physics, why isn’t there anything similar for the game logic?
There is, but not using the default Tick, which is logically called on every engine iteration, depending on how long all the computations your game needs take. There is also a way to “cap” the framerate so it doesn’t exceed a certain fps, or to sync it to your monitor (vsync). If what you need is logic that runs at a set interval, there are timers in Unreal that will do just that.
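For example, something along these lines (a rough sketch; the class and function names are placeholders, CombatLogicTimerHandle is an FTimerHandle member, and you should check how looping timers behave during long frames in your engine version):

```cpp
void AMyGameCharacter::BeginPlay()
{
    Super::BeginPlay();

    // Run UpdateCombatLogic every 1/30 s, independent of the render framerate.
    GetWorldTimerManager().SetTimer(CombatLogicTimerHandle, this,
                                    &AMyGameCharacter::UpdateCombatLogic,
                                    1.f / 30.f, /*bLoop=*/true);
}

void AMyGameCharacter::UpdateCombatLogic()
{
    // Fixed-interval gameplay update goes here.
}
```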
I did a pool demo that computed time-of-impact for the collisions during a Game Physics course at a Swedish University, so a little, yes.
But a character holding a melee weapon and spinning like a tornado at insane speed, while also moving at high speed along a curved path, is probably not yet supported by PhysX as CCD.
So modelling this for something running at 60 fps, as opposed to something that can potentially drop to 2.5 fps, requires different approaches.
Michael,
I recommend you follow Rama’s example from the forum post you provided. Relying only on the animations for melee combat is dangerous because it relies on the framerate being high enough for collisions to be detected within a single frame.
Your solution is fair, but even at higher framerates you could miss collisions if you create a character that moves fast enough. You need to perform a trace between the position in the previous frame and the position in the current frame. Take your solution (from post #14 in your link) and combine it with what Rama has done.
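Something roughly like this, run every Tick for the weapon tip (just a sketch: the socket name, collision channel, and the PreviousTip member holding last frame’s location are all placeholders):

```cpp
void AMeleeCharacter::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    const FVector CurrentTip = GetMesh()->GetSocketLocation(TEXT("WeaponTip"));

    // Trace from where the tip was last frame to where it is now, so a fast
    // swing (or a slow frame) can't step straight over a target.
    FHitResult Hit;
    FCollisionQueryParams Params(FName(TEXT("MeleeTrace")), /*bTraceComplex=*/true, this);
    if (GetWorld()->LineTraceSingleByChannel(Hit, PreviousTip, CurrentTip, ECC_Pawn, Params))
    {
        // Apply damage / hit reaction here.
    }

    PreviousTip = CurrentTip;
}
```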
Additionally, my understanding is that animations are part of the physics simulation (someone please correct me if I’m wrong). If you set Max Physics Delta Time and Max Substep Delta Time to low enough values, the game world will slow down when the framerate drops, meaning you’ll be less likely to miss those collisions in the first place. This is a less optimal solution, though, because it messes with the overall speed of the game.
The problem with my solution is that it breaks down if the character spins more than 180 degrees in one step, or does a sword swing where the sword rotates more than 180 degrees in one step while moving along a curved path.
So at this point I’m wondering how fast my attacks can be.
Or if I need to precompute tons of stuff and use triangles representing the slash.
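My rough idea so far is to subdivide each frame’s swing into sub-steps and trace every short segment, something like the sketch below; but since the transforms are only interpolated between two sampled frames, it still can’t disambiguate a rotation of more than 180 degrees per frame, which is exactly the limit I’m asking about (all names are placeholders):

```cpp
// Sketch: split one frame's swing into sub-steps by interpolating the weapon
// transform and trace each short segment instead of one long chord.
// Note: interpolating between two sampled transforms is still ambiguous once
// the rotation per frame exceeds 180 degrees.
void AMeleeCharacter::TraceSwingSubSteps(const FTransform& PrevWeapon,
                                         const FTransform& CurrWeapon,
                                         int32 NumSubSteps)
{
    // LocalTipOffset = weapon tip position in the weapon's local space (member).
    FVector PrevTip = PrevWeapon.TransformPosition(LocalTipOffset);

    for (int32 Step = 1; Step <= NumSubSteps; ++Step)
    {
        const float Alpha = static_cast<float>(Step) / NumSubSteps;

        // Interpolate the whole transform so the tip roughly follows the arc
        // of the swing rather than one straight chord across it.
        FTransform Blended;
        Blended.Blend(PrevWeapon, CurrWeapon, Alpha);

        const FVector Tip = Blended.TransformPosition(LocalTipOffset);

        FHitResult Hit;
        if (GetWorld()->LineTraceSingleByChannel(Hit, PrevTip, Tip, ECC_Pawn))
        {
            // Handle the hit, then stop tracing this swing.
            break;
        }
        PrevTip = Tip;
    }
}
```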