Thanks!
I just ran a test on Quest 2 using SetDisplayFrequency in Blueprint under the new UnrealEngine-oculus-4.27.0-v32.0. When I set 60, ‘stat fps’ shows 60 FPS; when I set 72, 80, 90, or 120, it always shows 72 FPS.
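For reference, here is the C++ side of what I am calling, as a rough sketch only - I am assuming the Oculus fork’s UOculusFunctionLibrary exposes the same calls as the Blueprint nodes (SetDisplayFrequency, GetAvailableDisplayFrequencies, GetCurrentDisplayFrequency):

```cpp
// Rough sketch, assuming the UnrealEngine-oculus 4.27 fork, where the Blueprint
// nodes map to static functions on UOculusFunctionLibrary.
#include "OculusFunctionLibrary.h"

void TryRequest90Hz()
{
    // Ask which refresh rates the runtime will actually accept.
    TArray<float> Frequencies;
    UOculusFunctionLibrary::GetAvailableDisplayFrequencies(Frequencies);
    for (float Freq : Frequencies)
    {
        UE_LOG(LogTemp, Log, TEXT("Supported display frequency: %.0f Hz"), Freq);
    }

    // Request 90 Hz; the runtime may refuse and stay at 72 Hz.
    UOculusFunctionLibrary::SetDisplayFrequency(90.0f);
    UE_LOG(LogTemp, Log, TEXT("Current display frequency: %.0f Hz"),
        UOculusFunctionLibrary::GetCurrentDisplayFrequency());
}
```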
Make sure you check your Project Settings for a max FPS value; search for ‘frame rate’.
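If you want to rule that out from code, here is a minimal sketch that only uses standard engine settings and the t.MaxFPS console variable (nothing Oculus-specific), just to log the caps that can clamp ‘stat fps’ regardless of the headset’s display frequency:

```cpp
// Minimal sketch: log the engine-side frame rate caps (Smooth Frame Rate,
// Fixed Frame Rate, t.MaxFPS) that can clamp 'stat fps' below the HMD refresh rate.
#include "Engine/Engine.h"
#include "HAL/IConsoleManager.h"

void LogFrameRateCaps()
{
    if (!GEngine)
    {
        return;
    }

    UE_LOG(LogTemp, Log, TEXT("Smooth Frame Rate enabled: %d"),
        GEngine->bSmoothFrameRate ? 1 : 0);
    UE_LOG(LogTemp, Log, TEXT("Use Fixed Frame Rate: %d (FixedFrameRate = %.1f)"),
        GEngine->bUseFixedFrameRate ? 1 : 0, GEngine->FixedFrameRate);

    if (IConsoleVariable* MaxFPS = IConsoleManager::Get().FindConsoleVariable(TEXT("t.MaxFPS")))
    {
        UE_LOG(LogTemp, Log, TEXT("t.MaxFPS = %.1f (0 means uncapped)"), MaxFPS->GetFloat());
    }
}
```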
E.g. this Material from PC: if I want to use it in VR, how should I get the most similar effect…
First of all, should I always use 1 Material Slot?
Translucent UI also costs a lot, so should it be forbidden in VR?
What are the preferred settings when building lightmaps for Quest 2? Should I just test, and is a smaller resolution always better?
Making that material
Should be fine. Looks like there’s no displacement, no normal map and no translucency, so it should work as is.
Always use 1 material slot?
Nah, use as many as you need. Objects can be assigned more than 1 material (by way of multiple material slots) so you have more control over how the object is displayed. For example, if you had a house mesh with windows, you’ll want one material slot for the house and a different material slot for the windows. Check out the Paragon characters on the Epic Store and you’ll see how Epic used material slots.
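If it helps to see what a slot is at runtime, here is a minimal C++ sketch. GetNumMaterials and SetMaterial are the standard mesh component calls; the slot layout in the comment is just a hypothetical example:

```cpp
// Minimal sketch: material slots are indexed entries on the mesh, and each
// occupied slot becomes its own mesh section when the object is rendered.
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInterface.h"

void OverrideWindowMaterial(UStaticMeshComponent* House, UMaterialInterface* GlassMaterial)
{
    // Hypothetical slot layout for the house example: 0 = walls, 1 = windows.
    if (House && House->GetNumMaterials() > 1)
    {
        House->SetMaterial(1, GlassMaterial); // only the window slot changes
    }
}
```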
Translucent UI
In VR? Like Iron Man’s HUD or something? I mean, it’s not going to be as performant as non-translucent UI, but it should work. Most of these things you’re asking are ‘use your best judgement’ scenarios that are going to depend on the demands of your project. As a rule, if there’s something you want to do, just do it and see how it performs, and if it’s not good, then think up ways to optimize.
Preferred build settings for Quest 2
I don’t usually work on standalone Quest projects, but you should be fine with default settings as long as you have sensible lightmap resolutions set on your static objects. Higher resolution = more RAM required. There are profiling tools for this in the engine that should help.
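If you want to experiment per object, here is a small sketch; these are the same settings exposed under Lighting in the Details panel, 64 is just an example value, and static lighting still needs a rebuild to show the change:

```cpp
// Sketch: override the lightmap resolution on one static mesh component.
// Lower values mean smaller lightmaps and less RAM, at the cost of blurrier baked shadows.
#include "Components/StaticMeshComponent.h"

void UseSmallLightmap(UStaticMeshComponent* Mesh)
{
    if (Mesh)
    {
        Mesh->bOverrideLightMapRes = true;
        Mesh->OverriddenLightMapRes = 64; // example value; tune per object
        Mesh->MarkRenderStateDirty();     // lighting must still be rebuilt to see the result
    }
}
```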
Nah, use as many as you need. Objects can be assigned more than 1 material […]
I disagree on this one. Drawcalls are tight on standalone.
1 mesh + 1 material = 2 draw calls
1 mesh + 3 materials = 6 draw calls
This is a super-oversimplified and inaccurate example, but it gets the point across.
Don’t use multiple materials if you can avoid them, but use them if you must. If that makes sense…
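To make the arithmetic explicit, here is a back-of-the-envelope sketch; the exact multiplier depends on the renderer configuration (which passes run, and whether Mobile Multi-View collapses the two eyes), so treat it as a rough model, not what the renderer literally does:

```cpp
// Back-of-the-envelope only: each material slot is a separate mesh section, and
// each section is submitted roughly once per pass and once per view.
int EstimateDrawCalls(int MaterialSlots, int PassesPerSection, int ViewsPerFrame)
{
    return MaterialSlots * PassesPerSection * ViewsPerFrame;
}

// EstimateDrawCalls(1, 2, 1) == 2  ->  "1 mesh + 1 material = 2 draw calls"
// EstimateDrawCalls(3, 2, 1) == 6  ->  "1 mesh + 3 materials = 6 draw calls"
```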
Thanks for the clarification! I’d heard that multiple material slots affect performance but also that anything under like 10 is tolerable. But yeah, the rules are different for standalone Quest VR.
Are these just an example? Or do you mean that in forward rendering, with 1 mesh + 1 material, the 1st draw call/pass is the Z pass and the 2nd draw call/pass is lighting/shading?
Thanks for this expertise.
I’d like to add some points and would love to have your feedback.
■ Draw calls:
On Quest: not much over 200 draw calls & 400,000 triangles per frame
On Quest 2: not much over 250 draw calls & 900,000 triangles per frame
■ Mobile multiview:
It is supposed to be disabled for VR, but it must be enabled if you need to use a SceneCapture2D (in SceneColor capture source mode).
■ Forward shading/rendering:
It must be enabled, but beware that some features then become off-limits (DDX/DDY, WorldPositionBehindTranslucency) because Scene Depth can’t be read; the project won’t package at the end if you use them.
■ Mobile HDR:
It must be disabled. That means no more post-processing, and beware that you will not be able to use decals anymore either.
■ Dynamic Foveated Rendering must be activated.
■ Translucency:
- Don’t put big translucent polygons in front of the camera, or cut off all the unnecessary space; small translucent or additive materials seem usable.
- Do not mix transparent and opaque in the same mesh. (Does it render twice?)
■ Low poly:
Flat shading multiplies the vertex count in the engine compared to smoothed low-poly shading. In fact, anything that differs between adjacent polygons splits vertices (UV islands, vertex weights, smoothing groups…).
■ Do not use big or long unsubdivided polygons. (In other words, do not use polygons that are bigger than a tile chunk.)
■ The PixelDensity command can be set to 0.85 to reduce the resolution to an acceptable level (see the sketch after this list).
■ Tips from around 1990 that still help for 2022 VR dev:
- Work in Unlit if you can.
- Do not do heavy processing on Event Tick.
- Avoid destroying, prefer pooling.
- Audio should be 22 kHz, and don’t play too many sounds at the same time.
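For the PixelDensity point above, a minimal sketch; I am assuming the usual vr.PixelDensity console variable here, and the foveated rendering level is something you would set in the OculusVR plugin’s project settings rather than in code like this:

```cpp
// Sketch: lower the per-eye render resolution via the vr.PixelDensity console variable.
#include "HAL/IConsoleManager.h"

void ReduceRenderResolution()
{
    if (IConsoleVariable* PixelDensity =
            IConsoleManager::Get().FindConsoleVariable(TEXT("vr.PixelDensity")))
    {
        PixelDensity->Set(0.85f); // 85% of the default resolution, as suggested above
    }
}
```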
Are you sure the PrecomputedAOMask node works in VR on Quest?
I get a compilation error saying it doesn’t work with ES 3.1.
Do you mean you are using Vulkan, or something else?
Or is it because my project is Unlit, with forward rendering and HDR off?
Hardware Occlusion Culling is the main performance killer on Quest 2 with UE5. Sad part is that you can’t disable it via cmd - you’d have to disable occlusion culling entirely via Project Settings (which also disables Precomputed Visibility Culling).
This might be okay for games that don’t use traditional level design, but in general it’s a disaster and Epic doesn’t care at all.
I’ve never tried, sorry. I mostly develop for PCVR. PrecomputedAOMask is used with static lighting, so it should work, but again, I’m not sure.
Hi! I have a problem in a VR room I’m working on (for Meta Quest 2). The thing is that the meshes for the floor and walls, while being simple (5 triangles and a basic material with no textures), are by far what takes the longest to render, and I don’t understand why.
I’ve come to that conclusion too, to avoid big meshes with big triangles; but wouldn’t it be better for the GPU to use big triangles rather than smaller ones?
It seems that the screen is chunked into parts/tiles, and polygons that span several chunks at the same time are hard for the GPU to display. It’s better to have polygons that can be contained within a single tile…
I’m sure somebody will explain that a bit better.
I didn’t know that, thanks! It’s definitely something to keep in mind while developing for mobile VR.
Kamesan speaks truth - your GPU dices up the frame into ‘quads’ of pixels, and polygons that render over multiple quads make it cry because it’s forced to render bigger quads and then discard pixels it doesn’t need. Rendering wasted pixels = bad optimization.
As a rule, geometry alone takes very little processing time to render (within reason) compared to materials, so for Quest 2 you should prioritize low-cost materials over low poly meshes. I believe there is a hard ceiling to how many polygons Quest 2 (and other mobile platforms) can render in a single frame, but I’m not sure what it is. I’ve deployed levels to Quest 2 with fairly complex CAD data and not had a problem with performance.