Hi. I’ve been working on a project for the past couple of weeks that is essentially an on-rails walkthrough of an area with multiple characters. The scene includes lighting, sound, and animations.
The characters were created and edited in external 3D software, and their skeletal animations were imported along with them.
The characters’ movement and the player’s perspective are each driven by their own animation sequences, with the cues for both animation and sound handled through a cinematic.
As of now, when I simulate in the viewport on my system, the frame time is 16.67 ms (60 FPS), and that appears to be the frame time at which the animations play at normal speed.
However, setting a different FPS cap on the scene makes the animations play much more slowly as the frame time changes from 16.67 ms. The FPS drop seems to cut the scene’s speed roughly in half (not exactly half, since the sound drifts out of sync when I override the scene with t.overridefps 30).
If I got a consistent 60 FPS, this would be a non-issue. However, on lower-end computers, and when testing the final product in VR (my main target), the player’s movement speed, the character animations, the various particle systems in the scene, etc. all slow down.
I’m looking for solutions. I’ve heard that the “Get Delta Seconds” node can be used to make controller input and manual player movement framerate-independent, so FPS no longer affects timing, but I haven’t been able to find references showing how to implement it. What can I do to apply this to a scene with as many animations as this one? Thanks in advance.