Performance cost of animations when Play Rate = 0


I’m setting up some Blueprints that make skeletal meshes play looping animations. Under certain gameplay conditions the animations stop and restart, but instead of cutting off abruptly they ramp up and down via a “Set Play Rate” node: I slowly change the rate from 0 to 1, then back again.
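For reference, the ramp itself is nothing fancy. Here’s a minimal C++ sketch of what the Blueprint does each tick; the function name and the maxDelta step size are my own stand-ins for the “Set Play Rate” wiring:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Move the current play rate toward a target (0 or 1) by at most
// maxDelta per tick -- a stand-in for calling "Set Play Rate" each frame.
float StepPlayRate(float current, float target, float maxDelta) {
    float delta = std::clamp(target - current, -maxDelta, maxDelta);
    return current + delta;
}
```

Ramping from 0 to 1 at maxDelta = 0.1 takes ten ticks, and the same thing in reverse spins it back down to 0.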

My question is: what’s the performance cost of a skeletal mesh that’s playing an animation when its Play Rate is 0? Can I get away with never issuing the “Stop” command?

I tried telling the animation to Stop when its Play Rate reaches 0, but when I tell it to Play again it jumps back to the first frame of the animation, which obviously looks terrible, and I don’t see a way to make the animation stop only on its last frame.
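To be clear about the symptom, here’s a toy C++ model of the behavior I’m seeing (my own illustration, not Unreal’s actual implementation): Stop discards the playback position, so a later Play resumes from frame 0 instead of where the ramp left off.

```cpp
#include <cassert>

// Toy playback model illustrating the symptom: Stop() discards the
// current position, so the next Play() restarts at time 0.
struct ToyAnimPlayer {
    float Position = 0.0f;  // seconds into the clip
    float PlayRate = 1.0f;
    bool  bPlaying = false;
    float Length   = 2.0f;  // looping clip length, arbitrary

    void Play()               { bPlaying = true; }
    void Stop()               { bPlaying = false; Position = 0.0f; }  // reset -> visible jump
    void SetPlayRate(float r) { PlayRate = r; }

    void Tick(float dt) {
        if (!bPlaying) return;
        Position += PlayRate * dt;
        while (Position >= Length) Position -= Length;  // loop around
    }
};
```

With Play Rate at 0 and no Stop, Position simply stops advancing — which is exactly the state whose cost I’m asking about.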

There are many of these objects placed in my level, so I’m wondering what the cost will be once there are hundreds of them. Does the animation system automatically reduce its overhead when an animation’s rate is 0?

I did a simple test: I placed 196 of these objects in a map and watched frame times as I Played the animations, turned the rate up, turned it down, then Stopped the animations. As far as I can tell, the only thing that affected performance was Playing the animation; frame times rose by a very small amount while the Rate was at 0, though the effect was hard to measure. Issuing the Stop command didn’t seem to change anything.
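For anyone repeating the test, all I compared was the average frame time over a sample window in each phase (playing, rate 0, stopped). A minimal sketch of that bookkeeping, with names of my own choosing:

```cpp
#include <cassert>
#include <numeric>
#include <vector>

// Average frame time (ms) over a sample window -- the number I compared
// across the "playing", "rate 0", and "stopped" phases of the test.
double AverageFrameMs(const std::vector<double>& frameMs) {
    if (frameMs.empty()) return 0.0;
    return std::accumulate(frameMs.begin(), frameMs.end(), 0.0)
         / static_cast<double>(frameMs.size());
}
```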

If there’s no measurable effect in my test map, are there other considerations I should know about?