How to fix Lighting Corruption on Meta Quest 3

This one is just weird.
It's static/baked lighting we are using, but you can get it to display in real time in the preview using different settings.
In certain circumstances, lights (all light types, though I used point lights for testing) do not illuminate as they should; instead they 'draw darkness'.
I don’t know how else to describe it.
It is not a shadow; a shadow would be an absence of light caused by something blocking the light's path.
This is a darkness emanating from one side of the light itself.


I include some screenshots to help illustrate the point.
The ones with the darkness are captured directly from a Quest 3 headset; the others, showing what we expect for comparison, are from the Android Vulkan preview in the editor (which displays no darkness as long as Mobile DBuffer Decals is disabled).

This does not affect standard PC rendering, but you can see it in Android Vulkan preview (sometimes).
If I enable Engine - Rendering >> Mobile >> Mobile DBuffer Decals, I get the effect in the preview.
However, I am also getting it (albeit at different locations/angles from the light) in the packaged Quest app, even when Mobile DBuffer Decals is disabled.
If you move a light, the darkness it casts moves with it, in the same way that, well, the light should move with the light. The darkness is always cast at the same angle from the light, relative to the world.
Rotating the light has no effect; the darkness stays in the same place.
Rotate the static mesh room the light is in, for example, and the darkness will of course fall on different parts of it.

In case anyone stumbles across this with a similar problem, I will post how I resolved it.

Basically, the cause was textures imported from FAB (Quixel ones) which for some reason had their compression set to HDR.
This was making a mockery of any materials they were used in.

Selecting the offending textures and setting their compression to Default/Normalmap/Masks, as appropriate to their usage, immediately fixed the problem.
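In case it helps anyone with a large number of affected assets: I fixed mine by hand in the Texture editor, but the same change can be batched with Unreal's Python editor scripting. This is a hedged sketch, not exactly what I ran — the "/Game/Quixel" path and the _N/_ORM suffix convention are assumptions you'd adjust for your own project.

```python
def compression_for(texture_name: str) -> str:
    """Pick a TextureCompressionSettings enum name from a common suffix
    convention: _N for normal maps, _ORM for packed masks, default otherwise.
    (The suffix convention is an assumption; adapt to your naming scheme.)"""
    if texture_name.endswith("_N"):
        return "TC_NORMALMAP"
    if texture_name.endswith("_ORM"):
        return "TC_MASKS"
    return "TC_DEFAULT"

# Run the part below inside the editor's Python console -- the 'unreal'
# module only exists there, so it is left commented here:
#
# import unreal
# for path in unreal.EditorAssetLibrary.list_assets("/Game/Quixel", recursive=True):
#     tex = unreal.EditorAssetLibrary.load_asset(path)
#     if isinstance(tex, unreal.Texture2D) and \
#        tex.get_editor_property("compression_settings") == unreal.TextureCompressionSettings.TC_HDR:
#         setting = getattr(unreal.TextureCompressionSettings, compression_for(tex.get_name()))
#         tex.set_editor_property("compression_settings", setting)
#         unreal.EditorAssetLibrary.save_asset(path)
```

Only textures currently set to TC_HDR get touched, so it's reasonably safe to point at a whole folder — but back up (or rely on source control) before running anything like this.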

A good question might be:
‘Why on earth were the blasted things set to HDR in the first place?’
Does HDR even make sense for a normal map, or masks?
At this point, I’m kinda like ‘Whatever…’
At least we can get back to the job of completing this project for the client; so much time wasted on these left-field diversions!

I remember having difficulties importing 16-bit PNG files I originally made for another engine; I can't remember if it was the same issue, though. I think Unreal has issues with some formats.

High bit depth source textures make sense when exporting to a generic asset that could be used by multiple engines with arbitrary in-engine texture formats (as long as the importer handles them correctly). For example, if a shader expects an 8-bit linear texture and you only have an 8-bit sRGB texture, the conversion would cause banding.
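To make the banding point concrete, here's a small sketch (using the standard sRGB transfer function; my own illustration, nothing from the engine) showing that re-quantizing an 8-bit sRGB ramp into 8-bit linear collapses many dark values into the same code:

```python
def srgb_to_linear(c: float) -> float:
    """Standard sRGB EOTF, c in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Convert every 8-bit sRGB code to an 8-bit linear code and count survivors.
linear_codes = {round(srgb_to_linear(v / 255) * 255) for v in range(256)}
print(len(linear_codes))  # noticeably fewer than 256 distinct values remain

# Several of the darkest sRGB codes all collapse to linear 0 -- that's the
# banding you'd see in shadow gradients after the conversion.
```

The lost codes in the darks can't be recovered afterwards, which is why the conversion is better done from a higher-bit-depth source.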

Valve uses 8-bit hemi-octahedron encoding for normal maps in their VR games in order to increase the precision for the same texture size. If you fed a regular 8-bit texture into their importer, it would have nowhere to pull that extra precision from.
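For reference, hemi-octahedral mapping squeezes hemisphere normals (z ≥ 0) into just two components, so each 8-bit channel covers a smaller range and gains precision. This is my own sketch of the general technique, not Valve's actual code:

```python
import math

def hemi_oct_encode(n):
    """Map a unit normal with n[2] >= 0 onto two components in [-1, 1]."""
    x, y, z = n
    s = abs(x) + abs(y) + z          # project onto the octahedron's top half
    px, py = x / s, y / s
    return (px + py, px - py)        # 45-degree rotation fills the unit square

def hemi_oct_decode(e):
    """Invert the encoding and re-normalize."""
    ex, ey = e
    px, py = (ex + ey) * 0.5, (ex - ey) * 0.5
    v = (px, py, 1.0 - abs(px) - abs(py))
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def quantize8(e):
    """Store each component in an 8-bit channel."""
    return tuple(round((c * 0.5 + 0.5) * 255) for c in e)

def dequantize8(q):
    return tuple(c / 255 * 2.0 - 1.0 for c in q)
```

Because the two channels only ever describe the upper hemisphere, the 256 steps per channel are spent on half the directions a naive XYZ encoding would cover, which is where the extra precision comes from.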