How To: Game Development Best Practices for Quest 3 using Unreal Engine 5

Best Installation Path

  • Use latest Oculus Fork of Unreal Engine 5 (https://github.com/Oculus-VR/UnrealEngine)
    • Installation instructions: Unreal Engine 5.4.x for Meta Quest VR | Community tutorial
    • Contains the MetaXR & MetaXRPlatform plugins built-in
    • Many fixes specific to Meta Quest development that the official Epic build will not have
    • Includes many template VR projects to help you better understand working with Quest 3 in Unreal Engine
    • Extra Rendering Settings not available in the official Epic build, including:
      • “Support Mobile Application Space Warp”
      • “Support XR Soft Occlusions” (for mixed reality passthrough)
    • The source build will be around 250 GB on disk after you build the engine in Visual Studio (compared to roughly 150 GB for the official UE 5.4 download)
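For reference, building the fork follows the standard UE source-build steps documented in the repo README (you must link your GitHub account to an Epic Games account first, or the clone will be refused). A sketch of the steps on Windows:

```shell
git clone https://github.com/Oculus-VR/UnrealEngine.git
cd UnrealEngine
Setup.bat                  # downloads binary dependencies
GenerateProjectFiles.bat   # generates UE5.sln
# Open UE5.sln in Visual Studio and build the "Development Editor" configuration
```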

Important!

To test your Quest 3 environment correctly on your PC in VR Preview mode, you must do the following; otherwise you will get a PC VR experience instead:
* Click the Settings button (top-right corner in UE5) > Preview Platform > Android Vulkan Mobile.

Rendering Facts

  • The Quest uses a tile-based mobile GPU (Qualcomm Adreno), not a desktop-style renderer
    • Deferred rendering would perform too poorly on it
    • Forward rendering is the only practical option
  • Static baked lighting is the only viable lighting option for your scene; fully dynamic lighting is not an option.
    • Do not use movable lights, except for a single movable directional light (see Real-time Shadows below).
    • Use stationary lights to light your character, interactable objects, NPCs, etc., and make sure they do not overlap each other
    • Never use a stationary or movable rect light in your level, since rect lights are unsupported on mobile.
  • Keep draw calls as low as possible, 100 - 500
    • All static meshes should only use one material each
    • All interactive objects / skeletal meshes should also use only one material each
    • Under Project Settings > Rendering > Mobile, select “Enable GPUScene on Mobile”. This improves performance by batching objects that share the same mesh and material and sending them to the GPU in as few draw calls as possible
  • Keep the maximum post-culling scene poly count to roughly 100k - 250k
  • Use MSAA for anti-aliasing if you decide to use AA. FXAA & TAA will most definitely blur your scene, so do not use those methods
  • Watch the following videos to understand development on the Meta Quest better
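If you prefer to pin these in config, the MSAA and GPUScene settings above map to console variables that can live in DefaultEngine.ini. A sketch, with CVar names taken from current UE5 (verify them against your engine version):

```ini
[/Script/Engine.RendererSettings]
; 4x MSAA is cheap on a tiled GPU and avoids the blur of FXAA/TAA
r.MSAACount=4
; "Enable GPUScene on Mobile": auto-instances draws that share mesh + material
r.Mobile.SupportGPUScene=1
```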

Baked Lighting

  • Enable “Mobile HDR” under Project Settings > Rendering > VR for better lighting (this enables Post Processing for Mobile and incurs a small performance hit)
  • Add at least one Reflection Capture in your level (I use Sphere Reflection Capture)
  • Under “World Settings” tab, set “Indirect Lighting Quality” to 10 for best baked lighting results (which will slow down light baking but your light maps will be more accurate)
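As a config sketch, the “Mobile HDR” toggle corresponds to a renderer CVar in DefaultEngine.ini (Indirect Lighting Quality, by contrast, is stored per map in World Settings > Lightmass, not in an ini):

```ini
[/Script/Engine.RendererSettings]
; "Mobile HDR": enables mobile post processing at a small GPU cost
r.MobileHDR=1
```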

Real-time Shadows

  • Change the default shadow quality for Quest 3: go to Tools > Platforms > Profiles, select Meta_Quest_3, and under the Rendering section add a console variable setting r.ShadowQuality to 2
  • Enable “Support Movable Directional Lights” found in Project Settings > Rendering > Mobile Shader Permutation Reduction section
  • Only use a single directional light for shadows in your level and set the directional light to movable
  • If you need something like a flashlight in your game, also enable “Support Movable Spotlight Shadows”
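Sketched as config, the steps above would look roughly like this; the profile section name and CVar names are my best understanding and should be checked against your engine build:

```ini
; DefaultDeviceProfiles.ini — per-device shadow quality override
[Meta_Quest_3 DeviceProfile]
+CVars=r.ShadowQuality=2

; DefaultEngine.ini — keep the movable-shadow shader permutations
[/Script/Engine.RendererSettings]
r.Mobile.AllowMovableDirectionalLights=1
r.Mobile.EnableMovableSpotlightsShadow=1
```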

DO NOT:

  • Do not use “Stereo Foveation Level” or “Dynamic Foveation” in Project Settings > Rendering > VR. This currently crashes UE5.4.4 in VR Preview mode.

(I’ll be updating this topic as I continue working on my own project & understanding more about the nuances of Quest 3 & UE5)


Thank you for all these tips. I’m getting back into VR dev after a couple years off, so it’s nice to see not too much has changed. One challenge I’m running into is that the r.screenPercentage console command does not work in development or shipping builds, even when other console commands do work, like stats for debugging. How are we supposed to control the rendering screen percentage to tune performance and visual fidelity these days?


I’ll look into screen percentage for VR. So far, I know there is a project setting for Default Screen Percentages, and one of the options is for “Screen Percentage Mode for VR”, which I have set to “Manual”
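In VR the knob that usually works (rather than r.ScreenPercentage) is vr.PixelDensity. A sketch of a per-device override, assuming the Meta_Quest_3 profile name; exact behaviour may differ per engine and XR plugin version:

```ini
; DefaultDeviceProfiles.ini
[Meta_Quest_3 DeviceProfile]
; 1.0 = the headset's recommended render resolution; lower to trade fidelity for frame rate
+CVars=vr.PixelDensity=0.85
```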

This is a great best practices list for sure. One thing I do have to mention, which I have some doubts about, is the “merge all actors into 1” advice. While it will significantly reduce draw calls, if done poorly it removes any opportunity to LOD or cull individual meshes: you will be rendering however many tris the merged mesh has become. So it highly depends on the game and the meshes, which I understand is impossible to specify in a doc like this.

Imo Precomputed Visibility is quite vital to set up correctly; reducing “Generate Overlap Events” and the “occludes” flags on static meshes helps too. Only big walls and genuinely big blocking meshes should occlude, and smaller assets should not even be considered in that calculation.
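For anyone setting this up: Precomputed Visibility is a per-map Lightmass feature plus a runtime CVar. This is my understanding of the wiring, so verify it in your build:

```ini
; 1) World Settings > Precomputed Visibility > check "Precompute Visibility",
;    then rebuild lighting so the visibility cells get baked.
; 2) DefaultEngine.ini — allow the baked data to be used at runtime:
[SystemSettings]
r.AllowPrecomputedVisibility=1
```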


Thanks for starting this up. Reliable (up-to-date) information on this is getting harder and harder to find, to a degree, I think, because there is an element of “trade secrets”: the serious devs who have properly got to grips with this don’t want to share too freely. The big enigma here for me is Nanite. Even though it is not directly implemented on the mobile platform, the Nanite fallback system performs optimisation steps that are not well documented in terms of how they cross over with the “traditional” VR best practices. Whilst counterintuitive, it seems that having Nanite enabled in a mobile (e.g. Quest) app may actually improve performance (even though it isn’t actually using “Nanite” geometry) due to fallback optimisation. I’d love some definitive insight into this…

Great tips, but unfortunately it seems as though until we get some kind of occlusion culling solution back into UE5, it’s almost certainly not worth developing for the Quest 3 standalone. Unless there’s a solution to this that I’m missing?

Actually, the UE5 Meta fork includes the Adreno Occlusion Path, which leverages Adreno GPUs to improve occlusion culling on the Quest 3; I think this replaces Software Occlusion Culling.

Performance Features UE4, UE5, Meta Plugin



Occlusion culling has been available in UE5 since I started dabbling with the VR template 2 years ago… I haven’t tested to see if it actually works in mobile VR, but the setting is there.

This is great information. Thank you for bringing this to our awareness!

For Meta Quest development, you can enable Dynamic Resolution

Good advice, @Unearthlywhales. I was just repeating the advice from the Meta development video I shared along with this post, so I would agree with your statement. Occlusion is important for sure, and if we can combine efficient occlusion with low draw calls, that would make the best outcome.

Yes, I’m becoming increasingly confident that the Nanite fallback for mobile rendering is good at combining objects to reduce draw calls, so combining actors into one is redundant advice. I believe it also handles occlusion independently in some way (does it do its own precomputed occlusion? Is the Oculus branch occlusion solution implemented if Nanite is enabled?)