We are trying to optimize our project for Oculus Quest, but we are running into issues with both hardware and software occlusion culling. In theory, the engine culls an object if it is behind another object. Hardware occlusion culling is on by default and happens automatically, and it works perfectly fine in the editor, but it is not working at runtime on the Oculus Quest.
We used RenderDoc to see which meshes are actually rendered. The walls of the house are rendered almost at the end of the frame, so everything behind a wall that the camera cannot actually see gets rendered too. At first we thought objects are rendered front to back, closest to the camera first and furthest last, so we moved the wall's pivot closer to the camera than the objects behind it. The wall was still rendered almost at the end of the frame.
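As a cross-check on the RenderDoc capture, it can help to ask the engine itself what it thinks is occluded. Assuming UE4 (console command names are from stock UE4 and may differ by engine version), something like:

```
stat initviews                   ; shows "Occluded Primitives" / occlusion query counts per frame
r.VisualizeOccludedPrimitives 1  ; editor-only: draws boxes where primitives were occlusion-culled
FreezeRendering                  ; freezes visibility so you can fly around and inspect what survived culling
```

`stat initviews` also works on device through the in-game console; if "Occluded Primitives" stays at 0 on Quest while it is non-zero in the editor, that points at hardware occlusion queries effectively not running on the device rather than at scene setup.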
We use Precomputed Visibility too, but that is not working well either.
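For the precomputed visibility side, one thing worth confirming is that the camera is actually inside a visibility cell at runtime, since cells are only placed where the lighting build put them. Assuming UE4, the cells can be visualized with (cvar name as in stock UE4; verify on your version):

```
r.ShowPrecomputedVisibilityCells 1  ; draws the visibility cells generated during the lighting build
```

If no cells appear along the play area, precomputed visibility contributes nothing, regardless of the other culling settings.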
Then we also turned on Software Occlusion Culling (and set it up properly: we selected which meshes should act as occluders and which LOD each one uses for SOC). This actually helped, saving 5-10 fps on average, but it still did not cull everything properly.
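To see why SOC misses some objects, the software occlusion buffer itself can be inspected on device. These are the cvars the UE4 mobile software occlusion path uses, as far as we can tell (hedged; check against your engine version):

```
r.Mobile.AllowSoftwareOcclusion 1  ; what "Support Software Occlusion Culling" toggles
r.so.VisualizeBuffer 1             ; overlays the rasterized occluder depth buffer on screen
```

If a wall is missing from the visualized buffer, the first things to check on that static mesh are its "LOD For Occluder Mesh" value and whether it is actually flagged as an occluder.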
Then we thought that maybe Blueprint actors are rendered after plain static mesh actors (almost everything visible in the frame is a Blueprint actor, because everything is interactable, walls included). So we placed the same wall mesh as a plain StaticMeshActor just in front of the Blueprint walls. This was a success to some degree: it culled the objects behind it in some cases, but not all.
So what do you think is the problem here? And what determines the priority for rendering one object before another?
Some project settings that might be related:
Support Software Occlusion Culling - True
Occlusion Culling - True - Other settings for culling at default
Early Z-pass - Decide automatically
Clear scene - hardware clear
Instanced stereo - false
Mobile HDR - false
Mobile Multi view - true
Round Robin Occlusion Queries - true
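For reference, these settings should correspond roughly to the following DefaultEngine.ini fragment. The cvar names are assumed from stock UE4 and are worth double-checking against the project's generated config:

```
[/Script/Engine.RendererSettings]
; Support Software Occlusion Culling
r.Mobile.AllowSoftwareOcclusion=1
; Occlusion Culling
r.AllowOcclusionQueries=1
; Early Z-pass: Decide automatically
r.EarlyZPass=3
; Clear scene: hardware clear
r.ClearSceneMethod=1
vr.InstancedStereo=False
r.MobileHDR=False
vr.MobileMultiView=True
vr.RoundRobinOcclusion=True
```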