Hello. I have a silly question about using gravity with UE4.
I tested a simple drop of a ball from 400 m, and I only start counting the time once it passes the 200 m mark, just to let the ball build up its full velocity. It lands on the ground at 6.8 seconds.
That’s a half a second off from the traditional formula:
t = √(2y/g)
t = √[2*(200 m)/(9.8 m/s²)]
t = √(40.8 s²)
t = 6.39 s
So I went to Physics => Constants => Default Gravity Z and changed the number, say, to the Sun’s gravity. A higher value makes the object reach its maximum velocity more quickly, but it doesn’t change the overall result: the ball still takes 6.8 seconds to reach the floor instead of 1.2.
t = √(2y/g)
t = √[2*(200 m)/(273.614 m/s²)]
t = √(1.46 s²)
t = 1.209 s
OK. So the Sun’s gravity may be a bit exaggerated, but the Moon’s gravity behaves the same way. I still get a constant fall time of 6.8 seconds.
So I guess we could emulate the feeling of lower or higher gravity over very short distances, but long distances are out?
Apparently not! I was just reading about it on Wikipedia and it seems very interesting. But it doesn’t seem to affect anything in the engine: whether I put 0, 10, or 60000, the fall time stays the same.
What is your frame rate? If you are going lower than 30 fps, the delta time passed to the physics engine will be capped, which means it will take longer for the ball to drop, since the simulation thinks it’s not moving forward in time as quickly as the rest of the game. Sub-stepping would fix this.
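To illustrate the idea, here is a rough standalone C++ sketch (not the actual UE4 code path) using an assumed 20 fps frame time and the 1/30 s cap mentioned above: the game clock keeps advancing per rendered frame, but the physics only integrates up to the cap, so the fall appears to take longer in game time.

```cpp
#include <algorithm>
#include <cstdio>

// Toy Euler integration of a 200 m free fall, comparing an uncapped
// per-frame delta time against one clamped to 1/30 s. This only mimics
// the capping behaviour described above; it is not UE4 API code.
int main()
{
    const float g = 9.8f;               // m/s^2
    const float frameDt = 1.0f / 20.0f; // assume the game renders at 20 fps
    const float cap = 1.0f / 30.0f;     // physics never steps further than this per frame

    for (bool useCap : {false, true})
    {
        float height = 200.0f, speed = 0.0f, gameTime = 0.0f;
        while (height > 0.0f)
        {
            gameTime += frameDt;                                      // game time advances per frame
            float physDt = useCap ? std::min(frameDt, cap) : frameDt; // physics may see less time
            speed += g * physDt;
            height -= speed * physDt;
        }
        std::printf("%s delta time: lands after %.2f s of game time\n",
                    useCap ? "capped" : "uncapped", gameTime);
    }
    return 0;
}
```

With the cap in place the ball needs roughly 9.6 s of game time instead of about 6.4 s, so a capped step stretches the apparent fall considerably.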
Does the ball have linear damping set? This would slow it down as it falls.
Are you setting the gravity in cm? I.e. on Earth, gravity in UE4 should be -981.
The formula you’re using only works if the initial velocity is 0. In your case you’re allowing the ball to drop 200 m before you start measuring, which means the time it takes should be much shorter.
We can use your formula to figure out the total time a 400 m fall would take, and then subtract from it the time it takes to fall 200 m.
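Just to put numbers on that (same formula, g = 9.8 m/s²):

t(400 m) = √[2*(400 m)/(9.8 m/s²)] ≈ 9.04 s
t(200 m) = √[2*(200 m)/(9.8 m/s²)] ≈ 6.39 s
t(last 200 m) = 9.04 s − 6.39 s ≈ 2.65 s

So with the ball already moving at the 200 m mark, the last 200 m should only take about 2.65 s.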
OK, I stand corrected on the intent of the formula. But if that’s the case, I would still expect that if I place my ball at 200 m, starting at 0 velocity, the drop takes around 6.39 s. Instead, it now takes 8.18 s. I understand the formula a little better now, but the result is even less satisfactory.
FPS is at 87 (it’s PIE). Sub-stepping has been enabled nonetheless, with the same results. Linear damping is set to 0. Terminal velocity, as mentioned before, has been toyed around with from 0 to <put incredibly large number here> without any difference in the results. -981 for gravity isn’t a problem: it’s the default setting. And speaking of units, if 1 UU == 1 cm, then for 200 m I should place the ball at 20000 units on the Z axis, right? And yes, the floor’s point of contact with the collision sphere of the ball is set to 0.0 on the Z axis.
I see that the ball is using a projectile component which has a maximum velocity. When I turn the max velocity into something really big and set the initial velocity to 0, the number I get is 6.38 s, which matches what I’d expect.
So in this case the projectile component is effectively acting like terminal velocity, due to the max value.
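For what it’s worth, plugging a speed clamp into the same kind of standalone sketch reproduces the longer fall times. The 30 m/s (3000 cm/s) cap below is only an assumed example value, but with it a 200 m drop from rest lands after roughly 8.2 s, close to the 8.18 s reported above.

```cpp
#include <algorithm>
#include <cstdio>

// Toy fall from 200 m with the speed clamped the way a projectile
// component's max velocity would clamp it. Standalone sketch, not UE4 API;
// the 30 m/s cap is an assumed example value.
int main()
{
    const float g = 9.8f;         // m/s^2
    const float dt = 0.001f;      // small fixed step, just for the estimate
    const float maxSpeed = 30.0f; // assumed cap: 3000 cm/s = 30 m/s

    float height = 200.0f, speed = 0.0f, t = 0.0f;
    while (height > 0.0f)
    {
        speed = std::min(speed + g * dt, maxSpeed); // the clamp behaves like terminal velocity
        height -= speed * dt;
        t += dt;
    }
    std::printf("clamped 200 m drop lands after %.2f s\n", t); // prints roughly 8.2 s
    return 0;
}
```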
It was copied from the first person template, and I had to “undo” so many features to get it right. I guess people like building games with Roadrunner physics.