There are good reasons why vsync is used; it's not limited to mobile, nor specific to VR.
The thing with GearVR, it seems, is that the refresh rate of the Galaxy's display (or the GPU) isn't any faster. I think the reason we don't actually see tearing, as explained in the video, is timewarp (ATW) or some other VR-specific Oculus magic. I tried turning vsync off through the console, but on the device I didn't see any benefit, since I assume it's already hitting its limits as it is. I don't have a complete understanding of the whole ordeal myself, but this is how it seems to work.
It actually does. There are about 16 ms between frames, since the display can't refresh any faster. If the scene is empty, the engine simply hangs there, waiting for the next vsync. If you are within the requirements (draw calls, poly counts and such), then instead of just waiting it uses most of that time to actually do work. When you start throwing more at it than it can handle in 16 ms, and you add the time it still has to wait for vsync, that's when the framerate starts to drop and we see stuttering and all that.
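A rough way to picture this (my own illustration, not engine code): with vsync on, a frame can only be shown on a display refresh, so its effective cost rounds up to the next multiple of the ~16.67 ms refresh interval. A frame that takes even slightly longer than one interval gets held until the next one, which is why a small overrun halves the framerate instead of shaving off a little:

```python
import math

REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh at 60 Hz

def displayed_frame_time(work_ms):
    """Round the CPU/GPU work time up to the next vsync boundary."""
    return math.ceil(work_ms / REFRESH_MS) * REFRESH_MS

# Light, near-budget, and over-budget frames (hypothetical work times):
for work in (5.0, 16.0, 18.0):
    shown = displayed_frame_time(work)
    print(f"{work:5.1f} ms of work -> shown every {shown:5.2f} ms "
          f"({1000 / shown:.0f} FPS)")
```

Note how both the 5 ms frame and the 16 ms frame land on 60 FPS (the spare time is just spent waiting), while the 18 ms frame snaps to 30 FPS.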
There's one more thing: displaying the info overlays takes time as well. When I use "stat SceneRendering", for example, I get a 2-3 ms overhead for that alone, and a noticeable drop in FPS. So it's easy to conclude that "stat FPS/UNIT" aren't free either.
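The arithmetic here is unforgiving (hypothetical numbers below, using the same rounding-to-vsync idea): if your frame already costs, say, 15 ms, a 2-3 ms overlay is enough to push it over the 16.67 ms budget, and with vsync that means dropping straight from 60 FPS to 30 FPS rather than losing just a few frames:

```python
import math

REFRESH_MS = 1000 / 60  # ~16.67 ms per refresh at 60 Hz

def vsynced_fps(frame_ms):
    """Snap the frame cost to the next refresh boundary, then convert to FPS."""
    return 1000 / (math.ceil(frame_ms / REFRESH_MS) * REFRESH_MS)

base = 15.0     # assumed frame cost without any stat overlay
overlay = 2.5   # rough overhead of something like "stat SceneRendering"

print(round(vsynced_fps(base)))            # -> 60
print(round(vsynced_fps(base + overlay)))  # -> 30
```

So the overlay itself can distort the very measurement you brought it up to take, at least when you're sitting close to the budget line.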
I'm not an expert on the matter, so anyone should feel free to jump in and correct me, but this is what I've picked up while digging into it this last month.