Possible non-determinism with ChaosWheeledVehicleMovementComponent ConstraintSuspension

Hey team

Just looking for some support with a Chaos Vehicle issue. We’re in the process of evaluating Chaos Vehicles, and we suspect that something inside the SuspensionConstraints is non-deterministic.

We have some functional tests in the editor which we’ve been using on our in-house vehicle for a while. They simply drive the car around at 60FPS, recording the positions it ends up at, then change the frame rate to a variety of other game frame rates and drive it around with the same inputs.

This test passes for our in-house vehicle but fails when running with a Chaos Vehicle. Notably, the tests succeed with Chaos Vehicles if we disable constraint suspension using the console command “p.Vehicle.DisableConstraintSuspension 1”.

We start the car up in the air, asleep; the test fails the moment the car hits the ground.

Hoping you can shed some light - thanks!

Steps to Reproduce

  • Run a chaos vehicle with a fixed set of inputs
  • Run it again with the same fixed set of inputs at a different framerate
  • Vehicle output deviates

Hi Darcy,

What you have run into there, by the sounds of it, is called the ‘stiff spring problem’ - a very well-known issue when dealing with any ‘springy’ objects in a simulation. The issue is caused by a circular dependency: the spring force depends on a position (or another time-varying variable) which the force itself is changing each step. There are _some_ models which can calculate the suspension a different way (using frequency), but they are rare, and can only simulate simple setups before the maths falls over.

Generally we recommend you watch out for instability or juddering when setting up the suspension, and alter the substepping amount in order to get numerical stability.

I’m afraid I cannot answer why your car does not have the same issue - it could be taking a large number of substeps internally - but even in the highest-end vehicle simulators I have been aware of, this is something which needs designing around or mitigating.

All the best

Geoff

Hi Darcy,

You’ve mentioned in the first paragraph that it is the same state/inputs, but later you say ‘using the frame delta’. In that case, the delta time itself is an input, since the underlying equations are differential (i.e. time-dependent).

Can I check what sort of divergence you are getting to begin with, and secondly does this go away if you switch to a fixed timestep (using async physics) and repeat the drop test?

There is also the possibility of the constraint order not always being the same, but it makes sense to check the usual issues first.

A repro would be great, but let’s check the usual suspects first!

Best

Geoff

Ah, my apologies Darcy - I am following now.

Yes if you can provide a repro, I will take a look into what is changing.

Best

Geoff

Hi folks,

I’ve made a small bit of progress, and I’m fairly certain this is not to do with time or GT-PT synchronisation. All the inputs look identical, but the runs diverge during the solve.

It is strange that turning on determinism isn’t fixing it, so I’m digging further to see whether the collision is diverging, and if not, digging into the solve itself.

Best

Geoff

Hi Gents,

I’ve just turned on wireframe in CVD, and this is what I have:

This is super, super complicated for a physics mesh - there could be all sorts going on with accuracy errors and construction ordering which can cause divergence. I’d recommend you simplify this down and create a very simple convex hull. There are also random unconnected bits of geometry. The wheels have brake calipers in the physics mesh as well, when a simple cylinder should be all they need for the physics.

This is highly likely to be the cause of the issues we are seeing. If you can redo this geometry, make it super simple and retry the test it could solve the issue.

All the best

Geoff

Just want to provide a little extra context on a knock-on effect of this non-determinism which makes it very important to us.

We’re making a multiplayer game, and state synchronization and replay-ability make replication much easier to reason about and work with.

While the amount of error here might be small, I don’t want to add tolerances to our tests. Once you start adding tolerances, it’s hard to know whether a test is failing because of error accumulation from this issue, or because something else we’ve changed has introduced new error.

For example, we might add a behavior in the game which allows the car to boost. When we add a test for boosting, we need to verify that we’ve built it in a deterministic fashion - but if we’ve got tolerances, there are two failure modes: if the tolerance is too high, we might miss that we’ve implemented the boost non-deterministically; if it’s too low, we might incorrectly report that the boost functionality is non-deterministic when in reality the issue was a different amount of non-determinism in the suspension due to the vehicle’s higher speed.

Hi folks.

void FPBDSuspensionConstraints::RemoveConstraint is the issue here. The good news is that the runtime is deterministic from everything I can tell; what is happening in this scenario is that the constraint ordering between the runs is not consistent. The new constraints for the new car get added first, and then the old constraints are deleted using a ‘RemoveAtSwap’ algorithm. This effectively reorders the constraints at that point, which then introduces marginal floating point errors, since the constraints are solved in a different order.

You can see this in action by putting breakpoints in FPBDSuspensionConstraints::AddConstraint and RemoveConstraint; I found it easiest to watch the ‘SuspensionLocalOffset’ array getting reordered.

I’ll ping the dev folks and let them know.

Best

Geoff

Hey Geoff

I don’t think this is a stiff spring problem - I’m not sure you’ve understood what I’m highlighting here, which is that given the same state, the same inputs, and the same hardware, the simulation gives different results, indicating that the order of operations or the state of the simulation is not consistent.

There are a lot of factors that could cause this, including:

  • Using frame delta instead of fixed delta
  • Reading state from game thread state instead of physics thread state
  • Updating components in non-deterministic order

The problem is not instability or juddering as the car is driven around; the problem we’re trying to solve is the simulation not being deterministic, not a human-perceptible difference in behavior.

This is why we’ve built the automated tests we have, so that we can catch when we’ve introduced something which breaks determinism in the simulation.

If it would help we can provide a sample project which just includes our functional test for debugging

Thanks!

Sorry, it seems I haven’t provided enough context, there’s a bit of confusion here.

Our game does run with fixed timestep already, that’s how our in-house vehicle, and the Chaos Vehicle (without Constraint Based Suspension) are able to run deterministically.

Our functional test feeds inputs (throttle, steering, brakes, etc.) into the vehicle for each simulation frame (async tick), and records the position of the vehicle on that tick.

It drops the car from the air, then drives it around for a while with a fixed set of inputs.

The test first runs at 60FPS, then we run it again at 30FPS, and then at 10FPS, each time passing the same inputs to the vehicle in the async physics tick and comparing its position on that tick to the position it ended up at on the same tick in the first run.

This test exists to validate the physics determinism of the vehicle: to ensure that, regardless of render/game frame rate, the physics simulation gives the same result.

When I talk about frame delta in my second post, that’s just an example of something which could make the fixed-timestep simulation non-deterministic: accidentally using the frame delta time rather than the fixed timestep.

One of our engineers is going to create a sample project for you which includes just the test. You’ll be able to run it with default settings, and the test will fail on the second (30FPS) run; if you then set p.Vehicle.DisableConstraintSuspension 1 and run the test again, it will succeed, with the vehicle producing the exact same physics behavior every run, regardless of frame rate.

Thanks

Hey Geoff,

There seems to be a problem with the site - I’m trying to upload the project as a zip file in a reply, but it gives me an error saying I can’t upload zip files (this document claims I should be able to). [Content removed]

We’ve created a project which contains a single test, “Project.Functional Tests.VehicleTemplate.Tests.Map_VehicleDeterminismTests.Test_VehicleDeterminism_FrameRates”, which works as outlined above.

The project includes ChaosVehicles as a project plugin, because it contains a small modification which allows us to flush the buffered inputs directly into the input state for the current physics frame. This is necessary for our test so that we can guarantee the inputs which will be used for each simulation tick.

When you run the test out of the box it will fail; if you disable constraint suspension (p.Vehicle.DisableConstraintSuspension 1), it will succeed.

What’s the best way to provide the repro project to you given that I can’t add it as an attachment on this ticket?

This is the error I get trying to upload the zip file

Screenshot 2026-02-04 at 12.02.09 am.png(13.3 KB)

Hey Geoff,

Don’t think this is it, unfortunately. The test we provided you is just using the out-of-the-box mesh for Chaos Vehicles, and in our project we actually don’t use a mesh collider at all - our vehicle mesh has no collision; we just use a box. So I don’t think it’s collision complexity; we get the same result even using a simple box.

Hey Geoff

Thanks, this is great news! We actually found yesterday that the suspension ordering wasn’t deterministic, but hadn’t yet tracked down why. This helped us get a fix in which verifies that the sim is deterministic when these are kept in order.

Please let us know when the team has a fix, we’re eager to cherry-pick (github) when it’s available.

Thanks!