What is your average 'ms' cost per level using the GPU visualizer?

Hi, I'm trying to optimize this aspect of my levels and was wondering what ms costs everyone is getting and how well their projects run. I'm currently on 4.10.4, and my levels range from 5-11 ms while I try to figure out a working way to reduce the cost of ambient occlusion. Would love to know what you guys get.

I've also been using these resources to reduce costs as much as I can:

GPU performance tuning of “HZB SetupMips” (As seen in GPU Visualizer)…s-seen-in.html

The Vanishing of Milliseconds…s-dfe7572d9856

How to scale down and not get caught…scale-down.pdf

4.10 is very, very old (two years!) and predates a lot of the time Epic has since spent optimizing many different parts of the renderer.

Definitely. We've kept updating the engine for our game up to 4.10.4, but anything beyond that has given us a lot of problems converting our code. With unfortunate timing, we released the game in Early Access on this engine version, so we can't afford the time to troubleshoot another version switch at the moment.

Does 5-11 ms per level sound reasonable to you, though? I realize the objective is to get the number as low as possible, but we may have to live with some shortcomings on both ends right now.

I'm not sure what you mean by "per level". Anything below 16 ms is good for a regular game, and anything below 11 ms is good for a VR game.

Thanks for those guidelines. May I ask where you found that information?

What I mean by "per level" is that a particular level in level streaming gives me an average of around 5-11 ms. I usually don't have most of them loaded at once; they serve as separate levels in my game.

Well, it's just the refresh rate that most monitors run at. A 16 ms frame time means 60 fps, and 60 fps is usually what you want to have in a game. An 11 ms frame time is about 90 fps, which is what you need for a VR game.
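The relationship above is just frame time in milliseconds = 1000 / fps. A quick sanity check (a minimal sketch; the 60 and 90 fps targets are the common monitor and VR headset rates mentioned above):

```python
def frame_time_ms(fps: float) -> float:
    """Frame-time budget in milliseconds for a given target frame rate."""
    return 1000.0 / fps

# Common targets: 60 fps for monitors, 90 fps for VR headsets.
print(frame_time_ms(60))  # ~16.7 ms per frame
print(frame_time_ms(90))  # ~11.1 ms per frame
```

So a 5-11 ms level cost leaves headroom within a 16.7 ms budget, but only just fits a 90 fps VR budget before anything else on the frame is counted.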

Yes, 4.11 significantly reduced render times; it was one of the major performance-focused releases. Upgrading would undoubtedly be best, but lacking that option you can resort to micro-optimizations by reducing content density, e.g. showing less grass and cutting non-essential (gameplay-wise) content in general. You can also give the player options to adjust these settings so the game renders better on their end.

Many studios ship their games capped at 30 fps, which I personally don't enjoy, so I won't recommend that. And while your timing measurements sound good, that doesn't mean players' computers will run the levels the same way; there are significant differences between a GTX 1080 Ti and a GTX 960, to mention one example. Just provide a few important graphics options in the menu to help scale your game's content, and players with smaller configs will be able to keep up higher frame rates.
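To make those menu options concrete: UE's scalability system is driven by console variables, so a graphics menu typically just sets variables like the ones below. This is only a sketch; which variables matter and which values make sense depend on the project. The `sg.*` group variables take 0 (low) through 3 (epic), `r.ScreenPercentage 80` renders at 80% resolution, and `r.AmbientOcclusionLevels` controls how many mip levels SSAO uses, which is relevant to the ambient occlusion cost mentioned above:

```
sg.ShadowQuality 1
sg.PostProcessQuality 1
sg.EffectsQuality 1
r.ScreenPercentage 80
r.AmbientOcclusionLevels 1
```

These can be executed at runtime (e.g. from a settings menu via Execute Console Command) or set in config under `[SystemSettings]` in DefaultEngine.ini; testing each one in the GPU Visualizer will show which buckets it actually shrinks.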

By the look of your game, I believe you must be using baked lighting, which is good for performance, but any dynamic light in the scene takes a significant toll. Go easy on dynamic light setups, and bake out everything you can to reduce the cost; that should help a lot.

You can also run actual play tests on smaller configs to tune your game to the hardware configs you expect players to have.

I think it's fortunate that the GTX 1060 is becoming more widely used; that's good news for your game's frame rates on an average player's computer.

"The Vanishing of Milliseconds" is a particularly good article; I recommend reading it at least twice. :) There is more to it here …e/Performance/ , and you can find even more on the forums, piece by piece!

I'd seriously consider upgrading to at least 4.11.

Thank you all for the wonderful advice; I've learned a lot. I'll see what I can do to improve performance based on what you've said, and I'll keep trying to make 4.11 work. Thank you.