I have been a bit confused about how the frame rate works in Unreal, and I am not sure if I have an actual problem or if there is something I am not doing right…
I currently have the frame rate set to 30 fps (Project Settings -> General Settings -> Fixed Frame Rate: 30.0). When in the general work viewport and the play/simulate viewport, my frame rate fluctuates between 30 fps and 140 fps. I am making a high-speed racer-ish type of game, so this makes it difficult to set speed/acceleration values because the frame rate is so inconsistent. The frame rate usually sits at about 70-100 fps, but when I turn it drops down to 30-40 fps. This confuses me for a number of reasons:
1. Why is the frame rate sitting at 70-100 fps in the first place when it is supposedly “fixed” at 30 fps?
2. Why is it dipping so wildly (down to the frame rate it is actually supposed to be at the whole time) when I turn?
While the level is fairly large and full, there are no lights or Matinees, and most meshes are re-used many times to build the environment. I do have complex collision turned on for almost all objects (I plan to optimize collision later once the level is done), so I suspect that could cause slowdown. But what is the fixed frame rate for if it isn’t…uh…fixing the frame rate at said value, or even anywhere near it? Or is the fixed frame rate related to ticks, and the actual frame speed related to the graphics card or something?
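For reference, I believe this is what that setting ends up writing to DefaultEngine.ini (I am guessing at the exact keys here, so correct me if I am wrong):

[/Script/Engine.Engine]
bUseFixedFrameRate=True
FixedFrameRate=30.000000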
But! For your acceleration/speed, it sounds like you’re not working with Delta Time. Delta Time is the number of seconds that have passed since the last frame. It is used because frame rates fluctuate: the rate can be capped at a maximum of 30 FPS so it never goes above that (common on consoles, from what I understand), but it can still drop to 15 FPS or 8 FPS due to performance issues, bottlenecking, etc.
Delta Time compensates for that by measuring how much time has passed between frames, because from frame 1 to frame 2 it could be, say, 0.005 seconds, while from frame 2 to frame 3 it could be 0.5 (in an extreme case). So take your acceleration and multiply it by Delta Time (FApp::GetDeltaTime() if you are in C++ and outside a Tick function, or Get World Delta Seconds in Blueprints).
This is why, if you ever look at the Tick() function, its only parameter is a float called DeltaTime: a Tick is a frame, but it still needs to be expressed in seconds to be useful in many cases.
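For example, something along these lines in C++ (just a sketch; ARacerPawn, Acceleration and CurrentSpeed are made-up names, not anything from your project):

// Frame-rate-independent movement in an actor's Tick.
// ARacerPawn is assumed to be a pawn/actor subclass with two float members:
// Acceleration (units per second squared) and CurrentSpeed (units per second).
void ARacerPawn::Tick(float DeltaTime)
{
    Super::Tick(DeltaTime);

    // Scale the per-second acceleration by DeltaTime to get the speed gained this frame.
    CurrentSpeed += Acceleration * DeltaTime;

    // Scale the per-second speed by DeltaTime so the distance covered per second
    // is the same whether the game is running at 30 FPS or 140 FPS.
    AddActorWorldOffset(GetActorForwardVector() * CurrentSpeed * DeltaTime, true);
}

In Blueprints the same thing is just multiplying by the Delta Seconds pin on Event Tick (or Get World Delta Seconds anywhere else).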
From 4.0 up to 4.9 I’ve found that frame rates can vary when projects are played inside the editor, so as a means of tracking performance it is not very reliable at face value. The most common reason for “strange” FPS drops is elements being added to the environment that are not yet compiled and are still CPU bound. A typical example is when you change the overall lighting and get “Preview” stamped all over the place, which indicates that your lightmaps are no longer valid and the shadowing is only a proxy of what your shadowing is going to look like.
Another effect on performance, and an assumption on my part, is that some added assets do not bind to the GPU until the environment is processed at the production level, which requires a much longer build time because the compile has to sort each item in turn. This could explain why the FPS drops when you turn.
So yes, I’ve seen wild FPS drops that suggest something is wrong and needs to be checked out right there and then, but I think that’s just the trade-off of being able to iterate and edit at the cost of FPS. At the same time, I’ve seen the FPS rebound, so as far as development goes it’s probably best practice to wait until the project is nearing completion and save optimization as the final step before delivery.
Overall, though, basing any kind of decision about what the problem might be on PIE is not really a good idea. To get a better sense of what’s going on, best practice would be to compile and package a runtime map and, at run time under true game conditions, generate a profile to see what is actually happening in the deliverable. Since it is generated in real time, the profile playback should tell you exactly what is causing the FPS drain that is not a by-product of an element behaving the way it should under editing conditions.
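If it helps, the console commands I would start with during such a run (names from memory, so double-check them against your engine version) are:

stat fps        - basic frame rate readout
stat unit       - splits the frame into Game, Draw and GPU time so you can see which side is the bottleneck
stat startfile  - starts recording a stats capture to disk
stat stopfile   - stops the capture; the file it writes can be opened in Session Frontend -> Profiler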
“When in the general work viewport and the play/simulate viewport,”
If you have a lot of editor windows open on a second screen, like the Material editor or something else, the FPS can drop by 30-40 fps (or more), so close everything before looking at the FPS in the editor during play/simulate.
I have the exact same problem.
It started with 4.9 Preview 1, got fixed in Preview 3, and reappeared in the stable release.
Even in an empty level with 5 sprites and one character the frame rate fluctuates like crazy.
Needless to say, my game is nearly unplayable when it dips from >160 FPS down to 20 every second or so.
Sometimes it remains stable for a few minutes only to start doing it again out of the blue.
I don’t know what to do…
P.S. I have the same issues in a compiled project. I don’t think this has anything to do with the editor.
As Roccino has stated, the issue isn’t so much that the frame rate drops; it’s that it is far too high in the first place. So when it “drops” it is actually dipping down to the speed it is supposed to be at. 50 fps seems really slow when you are coming down from 110 fps. So the fluctuation range in frame rate is about 100 frames (between 30 and 130 for me). Is this rare, or is this how it is for everyone in the editor?
Is there a way to cap the frame rate at 30 or 60? What does the fixed frame rate do, if not cap the frame rate?
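If a plain cap is all I need, is something like the t.MaxFPS console variable the intended way to do it? For example (assuming I have the variable name right):

t.MaxFPS 30    (cap at 30)
t.MaxFPS 60    (cap at 60)
t.MaxFPS 0     (remove the cap again)

Or is that what the Smooth Frame Rate option under Project Settings is for?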