Since Epic is on holiday until Jan 2nd, I thought I would post my problem here as well to get more eyes on it.
A bit of background on what is going on:
As part of our game, the PlayerCharacter's movement is done in “bursts” in a given direction. To support this, we expose three parameters to designers: Mass, PushAmount, and PushTime. Using these, we “push” the PlayerCharacter for PushTime seconds with a force of Mass * PushAmount.
To do this, I take the normalized direction vector, multiply it by Mass * PushAmount, and pass the result to AddForce on the character in its EventTick. This goes on for PushTime seconds (a boolean, enabled/disabled by a timer set to PushTime, tells EventTick whether it should call AddForce that tick).
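To make that concrete, here is a minimal standalone C++ sketch of the per-tick push logic. Plain Euler integration stands in for the engine's physics step, and the parameter values, struct, and SimulatePush name are illustrative only, not from our actual project:

```cpp
#include <cassert>
#include <cmath>

// Designer-facing push parameters, as described above (values illustrative).
struct FPushParams {
    double Mass = 100.0;
    double PushAmount = 50.0;
    double PushTime = 0.5;     // seconds the push boolean stays enabled
};

// Simulate the push along one normalized axis at a fixed tick rate and
// return the distance traveled while the force is applied. This mimics
// "AddForce every EventTick while the timer-driven flag is set".
double SimulatePush(const FPushParams& Params, double TickDt) {
    double Velocity = 0.0, Distance = 0.0, Elapsed = 0.0;
    while (Elapsed < Params.PushTime) {               // push-flag window
        const double Force = Params.Mass * Params.PushAmount;
        const double Accel = Force / Params.Mass;     // F = m * a
        Velocity += Accel * TickDt;                   // Euler integration
        Distance += Velocity * TickDt;
        Elapsed  += TickDt;
    }
    return Distance;
}
```

Notably, even this toy version produces different distances at different tick rates (e.g. 1/60 s vs 1/30 s steps), which is the same kind of symptom I'm seeing across launch modes.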
As debug info, I also output the distance traveled and the max velocity over that distance, and I have found what I think is a bug:
Depending on how I launch the game, the distance traveled and the max speed differ. For example, when I launch the game in PIE (standalone or in a new editor window), I travel less and move slower than when I launch it as Mobile Preview. I have also tested this on different machines, and the distance traveled and speed vary greatly between them. This isn't a matter of 1 or 2 units, but of almost 50~100 units at a time.
I've also tried changing the AddForce back to AddMovementInput, and I still get similar inconsistencies (even on the same machine). Basic, simple physics is behaving highly variably, which seems ridiculous.
For example, I set up a simple test where I set the velocity to 500 every tick for 1 second (using a timer delegate to ensure one second had passed before clearing a “setVelocityThisTick” flag) and then let it run. I don't zero the velocity at the end, so friction and momentum give a bit of extra travel. Even with that, the distance traveled should be nearly constant, around 510 or 520 units or so. However, it's not: in PIE I might get 520, but when Launched, I get 530, etc.
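The velocity test can be sketched the same way as standalone C++. The drag model and Friction value here are assumptions purely for illustration, not our game's actual friction:

```cpp
#include <cassert>
#include <cmath>

// Sketch of the velocity test: hold velocity at 500 every tick for one
// second (the "setVelocityThisTick" window), then let the character
// coast under a simple, assumed per-tick drag until it has stopped.
double SimulateVelocityTest(double TickDt, double Friction = 30.0) {
    const int WindowTicks = (int)std::lround(1.0 / TickDt); // ~1 second
    double Velocity = 0.0, Distance = 0.0;
    int Tick = 0;
    while (Tick < WindowTicks || Velocity > 1.0) {
        if (Tick < WindowTicks) {
            Velocity = 500.0;   // set velocity directly this tick
        } else {
            // illustrative linear drag once the flag clears
            Velocity *= std::max(0.0, 1.0 - Friction * TickDt);
        }
        Distance += Velocity * TickDt;
        ++Tick;
    }
    return Distance;
}
```

Here too the total travel comes out near 500 units but differs with the tick rate even though the logic is identical, e.g. a 60 Hz step coasts slightly farther than a 30 Hz one under this toy drag.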
I’ve also tried turning on substepping, which doesn’t seem to solve the problem either.
We are running UE 4.6.1 and this is currently all being done in Blueprints.
I have a hard time believing the PhysX engine is this sloppy; physics engines shouldn't be this variable. If anything, it should be the opposite. Either I have not done something correctly or there is a bug, but I've checked every parameter I can think of and can't find any flag that says “ensure the PlayerCharacter's physics calculations are consistent.” Or is there something I have to set in C++ that isn't exposed in Blueprints?
Any assistance would be greatly appreciated!