Game timing varies on different machines or devices

So this is sort of a “duh” question because I know timings vary across machines.

That being said… I’m writing a simple game where the core game loop revolves (see what I did there) around flipping a rocket. The timing of the rocket’s launch, the rotation of the flip, and the landing have to be consistent across all devices. I found that Unreal’s physics engine had too much variance between devices, and little stability or consistency, if any, on mobile. So I opted for an old-school approach and manually coded the launch and flip using the Set/Add actor functions with rates derived from DeltaTime, only engaging the physics engine when needed, say for imprecise things like crashing the ship. At first this seemed to be the way to go, as it was more consistent, but I’m starting to notice deviations again. They’re not as bad as before, but they’re enough to mess up the core gameplay to the point where I’m almost done with Unreal for good – at least for this game.

I guess my main question is: is there a way to get consistent gameplay across devices with this engine?

I’ve written 2D game engines in the past, and in my experience the easiest way to smooth out gameplay between devices was to get the processor’s tick count and scale the game’s timing based on that – effectively speeding it up on slower machines and slowing it down on faster machines to get a consistent feel. Since Unreal was created by people waaaay smarter than me, and presumably has cross-platform capability, I was hoping that very common thing would have been handled. But the more I work with Unreal, the more I find myself scratching my head.
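For reference, in my old engines that tick-count scaling boiled down to something like this – a plain C++ sketch, the function names are mine and not any engine’s API:

```cpp
#include <cassert>
#include <cmath>

// Frame-rate-independent motion: scale the per-second rate by the
// frame's delta time, so the total rotation depends on elapsed wall
// time rather than on how many frames the device managed to render.
float StepRotation(float CurrentDeg, float DegPerSecond, float DeltaSeconds)
{
    return CurrentDeg + DegPerSecond * DeltaSeconds;
}

// Helper: simulate a run of equal-length frames and return the final angle.
float SimulateFrames(int Frames, float DegPerSecond, float FrameDt)
{
    float Deg = 0.0f;
    for (int i = 0; i < Frames; ++i)
        Deg = StepRotation(Deg, DegPerSecond, FrameDt);
    return Deg;
}
```

A 60 fps device stepping 60 times at 1/60 s and a 30 fps device stepping 30 times at 1/30 s both end up having rotated (to within float rounding) the same 90 degrees after one second, which is the “consistent feel” I mean.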

I mean… now I’m in ranting territory, because I don’t know Unity – but out of sheer frustration with Unreal I installed it and wrote a quick and dirty port in an hour or two that seems to behave as intended, and most of that time was spent fumbling around to figure out the interface/environment. Now I’m not saying I’m good. Lord knows anyone who’s read my posts would agree I suck at this. I’ve been working in Unreal since UDK, and in Unreal 4 for over 5 years, and I feel lost all the time. But I use it because I’m stubborn: I’m familiar with it, I feel like the clunkiness is me and not the engine, and maybe if I keep using it I’ll get better. I haven’t switched to Unity mainly because I’m unfamiliar with it.

The engine is cross-platform, but physics is not deterministic across different timesteps or devices – nor is anything else in the engine, to be honest. UE4 doesn’t have a fixed physics simulation rate either, instead opting for a semi-fixed timestep. There are a lot of valid reasons for this.
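To unpack “semi-fixed timestep”: the frame’s delta is consumed in fixed-size physics slices, up to a cap so a slow frame can’t trigger the spiral of death. Roughly like this – a generic sketch, not UE4’s actual substepping code, and the names and cap-handling are mine:

```cpp
#include <cassert>

// Semi-fixed timestep: advance physics in fixed-size slices, capping
// the number of slices per frame. Leftover time is carried to the next
// frame; when the cap is hit, the excess is dropped instead of being
// allowed to accumulate (one common policy among several).
struct Substepper
{
    float FixedDt   = 1.0f / 60.0f; // size of one physics slice
    int   MaxSteps  = 6;            // cap to avoid the "spiral of death"
    float Remainder = 0.0f;         // unsimulated time carried over

    // Returns how many fixed-size steps to simulate this frame.
    int Advance(float FrameDt)
    {
        Remainder += FrameDt;
        int Steps = static_cast<int>(Remainder / FixedDt);
        if (Steps >= MaxSteps)
        {
            Steps = MaxSteps;
            Remainder = 0.0f; // drop the debt rather than spiral
        }
        else
        {
            Remainder -= Steps * FixedDt;
        }
        return Steps;
    }
};
```

The upshot for determinism: two devices running the same game produce different frame-delta sequences, so they slice the simulation differently, and the results drift apart.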

No physics engine or game engine that I know of is guaranteed to be deterministic, though. Lockstep determinism is kind of a thing of the past, or at least usually reserved for very bespoke engines that power RTS games and the like – but you have to support that from the very beginning, and it’s extremely difficult. Even different compilers and CPUs can break determinism, no matter how well thought out your engine is.

Generally speaking, if you give the physics engine (PhysX) the exact same input you will get the same output - but the likelihood that you are truly giving it the exact same input is practically zero.
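To illustrate the “exact same input” point: the force value may be identical, but the sequence of delta times the simulation steps through is part of the input too. With a simple Euler-style integrator (a sketch, names are mine), the same 10.0 acceleration over the same one second of simulated time lands in different places depending on how that second is sliced into frames:

```cpp
#include <cassert>

// Minimal body with one axis of motion.
struct Body { float Pos = 0.0f; float Vel = 0.0f; };

// One Euler step under a constant acceleration (force on a unit mass):
// velocity is updated first, so this frame's position uses the new velocity.
void Step(Body& B, float Accel, float Dt)
{
    B.Vel += Accel * Dt;
    B.Pos += B.Vel * Dt;
}

// Integrate a constant acceleration over exactly 1 simulated second,
// split into N equal frames.
float PositionAfterOneSecond(int Frames, float Accel)
{
    Body B;
    const float Dt = 1.0f / Frames;
    for (int i = 0; i < Frames; ++i)
        Step(B, Accel, Dt);
    return B.Pos;
}
```

With an acceleration of 10.0, one big 1-second step puts the body at 10.0 units, two 0.5-second steps put it at 7.5 units, and finer slicings land elsewhere again – same force, same total time, different outcome, because the delta-time sequence changed.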

The engine has its own systems for creating sort-of-deterministic movement across platforms for networked games – the character movement system, for example – but again, it’s highly bespoke, and it’s still not really deterministic because of all the other factors at play. The engine’s networking system is built around the server-client model, where the server continuously validates what the client does and therefore has authority over movement.

The system relies on having a single authority on the result of the simulation, rather than trying to depend on determinism. Most engines, especially off-the-shelf ones so to speak, do the same thing.

Right, I know the physics is flaky. Epic has outright said this and has been implementing sub-stepping to smooth things out, but that’s not viable on mobile, which is why I opted not to rely on the physics and instead manually move or rotate the pawn. But even with that, I’m seeing notable timing differences between devices.

Regarding PhysX and the exact same values: are you saying that hitting, let’s say, “Spacebar” to apply an arbitrary “10.0f” to movement or force does not always apply that “10.0f”? I mean, how else would you do it? I didn’t tell it to apply “~10.0f”, you know? Or is that just not feasible?