Measuring performance of particle effects


I’d like to measure the performance of an explosion.

How would I go about doing that? I could, of course, spawn the effect 100 times in an environment and measure
the FPS impact, but is that really an accurate way to measure the performance of a particle effect?

And if I did that, I'd really need to know how to disable all the FPS caps and interference UE4 has in place. Even with t.MaxFPS 3000 and r.VSync 0 I still only get around 400 FPS.
Or is that the maximum I can achieve in an empty scene with an i7 4770K and a GTX 770? Doesn't seem like a lot. Is there anything else I can do about that?
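For reference, these are the console variables typically involved in uncapping the frame rate (names as they exist in UE4; whether each one applies depends on your project settings):

```
t.MaxFPS 0        // 0 removes the frame rate cap entirely (instead of setting a high value)
r.VSync 0         // disable vertical sync
```

Note that the "Smooth Frame Rate" option under Project Settings → General Settings → Framerate can also clamp FPS independently of t.MaxFPS, so it may be worth disabling that as well.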

Would just like to hear your thoughts on this.


Nobody? Come on, guys! :slight_smile:

You could use the shader complexity visualization to gauge how expensive it is.
An alternative is to uncap the frame rate as much as possible (seems like you have already done this) and then measure scene times with different counts of instances.
The profiler might help, but I have not used it.
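The "measure scene times with different instance counts" idea above can be sketched as a small script: record the average frame time at several instance counts, then take the slope of frame time vs. count as the approximate per-instance cost. This assumes the cost scales roughly linearly; the sample numbers below are made up for illustration.

```python
def per_instance_cost_ms(counts, frame_times_ms):
    """Least-squares slope of frame time (ms) vs. instance count.

    The slope approximates the cost added by each extra instance
    of the effect, in milliseconds per frame.
    """
    n = len(counts)
    mean_c = sum(counts) / n
    mean_t = sum(frame_times_ms) / n
    cov = sum((c - mean_c) * (t - mean_t)
              for c, t in zip(counts, frame_times_ms))
    var = sum((c - mean_c) ** 2 for c in counts)
    return cov / var

# Hypothetical measurements: explosions spawned vs. avg frame time
counts = [0, 50, 100, 200]
times_ms = [2.5, 4.0, 5.6, 8.7]
print(f"~{per_instance_cost_ms(counts, times_ms):.3f} ms per instance")
```

Measuring against a baseline of zero instances also cancels out the fixed cost of the empty scene, which is why uncapping the frame rate first matters: a cap flattens the differences you're trying to measure.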

An alternative is to download the performance tool that your GPU vendor supplies: NVIDIA = Nsight, AMD = GPU PerfStudio, Intel = ???


Thanks, that certainly helps. The shader complexity visualization isn't really suitable for me in this case; small differences in the complexity color wouldn't be measurable.

In which case you are going to need to load up a profiling tool to see what is going on under the hood (that could be the UE4 profiler or one from your GPU vendor).
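If you go the UE4 route, these built-in console commands are a reasonable starting point before reaching for a vendor tool (all of these exist in stock UE4):

```
stat unit       // game thread, draw thread, and GPU times per frame
stat particles  // particle-specific stats (spawn counts, tick times)
profilegpu      // one-frame GPU breakdown per pass, shown in a popup window
stat startfile  // begin capturing a stats file...
stat stopfile   // ...then open it in Session Frontend → Profiler
```

`stat unit` quickly tells you whether the effect is GPU-bound (e.g. overdraw from large translucent sprites) or game-thread-bound (e.g. particle tick cost), which in turn tells you which profiler to dig into next.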