For anyone else coming here via a Google search, I’ll provide you with a much better breakdown.
Opening the console and typing ‘stat unit’ (without the quotes) gives you a really easy-to-read view of your rendering cost. It shows the total time in milliseconds (ms) for each category: Frame, Game, Draw, and GPU, as well as the number of Draws (draw calls) and total Primitives in the scene.
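If you’d rather toggle it from gameplay code (say, bound to a debug key) than type it each time, here’s a minimal sketch using UKismetSystemLibrary::ExecuteConsoleCommand; the actor class and function name are just placeholders:

```cpp
// Minimal sketch: running 'stat unit' from C++ instead of the console.
// AMyDebugActor / ToggleStatUnit are hypothetical names; ExecuteConsoleCommand is real UE API.
#include "Kismet/KismetSystemLibrary.h"

void AMyDebugActor::ToggleStatUnit()
{
    // 'stat' commands are toggles: running the same command again hides the display.
    UKismetSystemLibrary::ExecuteConsoleCommand(this, TEXT("stat unit"));
}
```

The same pattern works for any of the stat commands mentioned below.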
The simple equation to figure out your target ms: take 1000 (the number of milliseconds in a second) and divide it by your target FPS. So 1000 / 60 ≈ 16.6 ms.
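Nothing Unreal-specific about that math; if you want the common targets side by side, a quick standalone sketch:

```cpp
// Frame-time budget per target frame rate: budget_ms = 1000 / target_fps.
#include <cstdio>

int main()
{
    const double TargetFps[] = { 30.0, 60.0, 120.0 };
    for (double Fps : TargetFps)
    {
        std::printf("%.0f FPS -> %.2f ms per frame\n", Fps, 1000.0 / Fps);
    }
    return 0;
}
// Prints: 30 FPS -> 33.33 ms, 60 FPS -> 16.67 ms, 120 FPS -> 8.33 ms
```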
So you want your total ms for “Frame” to be less than 16.6. “Frame” is how long it takes to produce a single frame on screen. Contrary to what a lot of people think, if your game is single-player then you DO NOT need to target 60 FPS. Of course, it’s better, but don’t stress yourself if you’re getting closer to 30. It’s only if you see frequent dips below 30 that you need to be concerned.
The “Game” category is how long your gameplay code takes on the CPU each frame (not rendering). This is how you get an idea of how long the game thread is spending on things like Blueprints or C++ logic.
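If you want to see which of your own functions are eating that Game time, the engine’s stats system lets you tag code so it shows up in its own stat group. A minimal sketch; the group, stat, and class names here are placeholders, while the macros themselves are the standard UE stats API:

```cpp
// Minimal sketch: timing your own game-thread code with the UE stats system.
#include "Stats/Stats.h"

// Creates a custom stat group and a cycle counter inside it (names are placeholders).
DECLARE_STATS_GROUP(TEXT("MyGame"), STATGROUP_MyGame, STATCAT_Advanced);
DECLARE_CYCLE_STAT(TEXT("MyExpensiveUpdate"), STAT_MyExpensiveUpdate, STATGROUP_MyGame);

void AMyActor::MyExpensiveUpdate()
{
    // Everything inside this scope is timed and reported under the group above.
    SCOPE_CYCLE_COUNTER(STAT_MyExpensiveUpdate);

    // ... expensive game-thread work here ...
}
```

Typing ‘stat MyGame’ in the console then shows that counter, which helps you narrow down what’s inflating the Game number.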
"GPU" is obvious.
"Draws" is the number of Draw calls for the scene (expect this to be high, even an empty level has 132 draws).
"Prims" is mostly a non-issue (again 4127 for an empty level) and just gives you an idea of your scene complexity. It’s not really something you need to worry about even when developing for mobile.
Once you’ve got a handle on the simplified breakdown in ‘stat unit’, you can move on to ‘stat scenerendering’ for a more robust breakdown.
Finally, ‘stat gpu’ gives you a more detailed breakdown of the different categories contributing to GPU cost, letting you target specific areas for optimization such as Lights, Translucency, etc.