@Gavi101 thank you for your investigation!
Regarding the occlusion and your workaround: you are correct about the buffer mechanism. It is used to reduce the pixel resolution of the nebula independently from the rest of the scene. This makes occlusion harder to realise, since the scene depth used for occlusion lags behind by one frame.
I assume your workaround makes the nebula occlude objects behind it, but not objects within or in front of it. That is also plausible behaviour.
In general, the occlusion here is done with an exponential density function inside Mat_DisplayNebulaBuffer and is controlled with the “EnvironmentOcclusion” parameter. You set this parameter in the NebulaRenderSettings MaterialInstance, and it is applied automatically at BeginPlay. To change the parameter during play, you will find it in the material within the RenderSphere component of the Nebula_BP in the level.
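To illustrate the idea (this is a toy sketch, not the actual material graph — the function name, signature, and exact formula are my assumptions based on the description above), an exponential-density occlusion term could look like this, with “EnvironmentOcclusion” scaling how hard the falloff cuts in:

```python
import math

def nebula_occlusion(scene_depth, nebula_depth, environment_occlusion):
    """Toy exponential-density occlusion term.

    Assumption: the nebula's visibility fades exponentially with how far
    an opaque surface sits in front of it; environment_occlusion scales
    the falloff (0 = no occlusion, higher = harder cutoff).
    """
    # How deep the opaque scene surface cuts into / in front of the nebula.
    penetration = max(0.0, nebula_depth - scene_depth)
    # Exponential falloff: 1.0 means fully visible, -> 0.0 means occluded.
    return math.exp(-environment_occlusion * penetration)

# Opaque object far behind the nebula: nebula stays fully visible.
print(nebula_occlusion(scene_depth=1000.0, nebula_depth=10.0, environment_occlusion=0.5))  # -> 1.0
# Opaque object in front of the nebula: visibility drops off exponentially.
print(nebula_occlusion(scene_depth=5.0, nebula_depth=10.0, environment_occlusion=0.5))
```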
The buffer has a major impact on performance but only a minor impact on the visual appearance, so it’s a difficult tradeoff I hope to address in future releases using GlobalShaders etc. With some small changes you can render unbuffered and get accurate occlusion instead of the performance benefits. You could also use r.SetSeparateTranslucencyScreenPercentage, but this increases the resolution of e.g. the stars as well…
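For reference, the console variable mentioned above can be set from the in-game console (the value here is just an example; 100 means full resolution):

```
r.SetSeparateTranslucencyScreenPercentage 100
```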
EDIT:
Okay, rethinking your color-disappearing problem: it really sounds like the typical behaviour when a freshly created nebula is baked into a persistent texture. In the last step of the creation pipeline, if the corresponding checkbox is set, baking freezes the editor for a few seconds to write the data to disk. Importantly, you then have to open those baked textures and press Save to make them persistent; the color will return as well.
OFF: below I describe some oversampling information, since I first thought the problem looked like that xD hf
"You mentioned the color disappearing, which sounds like a different subject: oversampling. If the raymarching takes too many steps within the volume, it cuts off the rendering to avoid crashing the display drivers and discards the color as a performance warning.
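Conceptually, that cutoff works like a step budget in the raymarch loop. This is a toy sketch, not the actual shader code — the names and the exact discard behaviour are my reading of the description above:

```python
def raymarch(densities, max_steps):
    """Toy raymarcher: accumulate density sample by sample, but bail out
    and discard the accumulated color once the step budget is exceeded
    (mirroring the oversampling 'performance warning' described above)."""
    accumulated = 0.0
    for step, density in enumerate(densities):
        if step >= max_steps:
            return None  # oversampled: color discarded as a warning
        accumulated += density
    return accumulated

# Within budget: the accumulated value survives.
print(raymarch([0.2, 0.2], max_steps=8))   # -> 0.4
# Budget exceeded: the color is discarded.
print(raymarch([0.1] * 10, max_steps=8))   # -> None
```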
Especially when you create a nebula of your own, be careful with the performance settings in the NebulaRenderSettings MaterialInstance!
The shape of the nebula itself also has an impact on performance: many hard edges and light translucency give the GPU more detail to display, which could cause oversampling.
As long as no bug is occurring, use the different runtime and creation parameters to optimize your nebula settings for your platform. As long as you don’t notice any undersampling slices, you can decrease the sampling for more performance, at least until those oversampling “warnings” are gone.
For super high performance you may look into the MatFunc_RaymarchNebulaDF and lower the maxSteps manually ;D