I’ve been going through a series of builds, testing out various features needed for a project I’m working on, and one of the more pressing issues is that SkyAtmosphere doesn’t immediately work in VR, leaving only the BP_Sky_Sphere in my skies. The directional light is working and is set to act as an atmosphere sun light.
From what I can tell so far, the SkyAtmosphere will only work if it’s sampled and applied through a material on a sky dome mesh (I’m using the same mesh as BP_Sky_Sphere). Is this actually the case?
The BP_Sky_Sphere material isn’t predictably picking up the output of SkyAtmosphereViewLuminance (I guess the “View” is right there in the name), which means the fill tone that’s essential for night-time skies needs to be rebuilt from scratch in the material. It does look like it’s filling in a little, especially the SkyAtmosphere’s default black ground, which is noticeably brighter. Still, in this screenshot the area outside my custom sphere is a much different tone.
I do have the Ultra Dynamic Sky package which seems to maintain control over all the necessary tones and gradients in a build.
To some extent I’m looking for a general explanation of SkyAtmosphere in the context of a VR/Quest application, where fog is best avoided but backgrounds are potentially very cheap and impactful. My instinct is that the sky data is either being computed with unusable inputs or rendered in a pass that gets skipped, given that fog and depth don’t get on well with VR headsets. What’s happening? Why do I need a sphere?