While that’s very flattering of you to say, TrueSky is still miles ahead of my system in various areas. I hope one day I can get this system to be something comparable to it, but for now, we are worlds apart.
The issue with the clouds and flying through them is that right now, their volume is generated from stacked slices. While this works when viewing them from the ground, the illusion unravels when you fly up through them: looking through them at a near-horizontal angle lets you see through the “gaps” between the slices. I can provide a picture as an example after I get off work this evening.
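To put a bit of (made-up) math on why that happens, here’s a tiny sketch; it isn’t from the actual shader, just the geometry of the problem:

```cpp
#include <algorithm>
#include <cmath>

// With horizontally stacked slices, the distance along the view ray between
// consecutive slices is roughly SliceSpacing / |ViewDirZ|. Looking straight up
// or down that's just SliceSpacing, but as the view goes horizontal
// (ViewDirZ -> 0) the step blows up and you end up looking between the slices.
float SliceStepAlongRay(float SliceSpacing, float ViewDirZ)
{
    return SliceSpacing / std::max(std::fabs(ViewDirZ), 1e-4f);
}
```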
There are a couple of ways to potentially combat this (which I’ll be looking into post-release), such as the one you mentioned: running a trace from above the camera to check whether you’re in the clouds, and then turning up the intensity of a camera-local fog. I’m also looking into rewriting the stratus clouds altogether, either with a cube volume raymarched through a volume texture, or with a procedural mesh component generating slices on the fly. Both come with technical and performance challenges, and would require me to rewrite a lot of the system. The current cloud shading model is unlit but driven through the blueprint and dynamic material instances to simulate light scattering, backlighting, and self-shading, with the intensities and falloffs all customisable without having to open the material, change the shader parameters, and recompile.
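For what it’s worth, here’s a rough, hedged sketch of that first idea in UE C++ (it could just as easily live in the blueprint). None of this is the actual system code: the layer bottom/top altitudes, the fog dynamic material instance, and the “InCloudFogIntensity” parameter are all placeholder names, and it assumes the slice meshes respond to the visibility trace channel.

```cpp
#include "Engine/World.h"
#include "Engine/EngineTypes.h"
#include "Materials/MaterialInstanceDynamic.h"

// Rough sketch only: checks whether the camera sits inside the cloud layer and,
// if so, pushes up a camera-local fog to hide the gaps between the slices.
void UpdateInCloudFog(UWorld* World, const FVector& CameraLocation,
                      UMaterialInstanceDynamic* FogMID,     // hypothetical fog MID
                      float CloudBottomZ, float CloudTopZ)  // assumed layer altitudes
{
    if (!World || !FogMID)
    {
        return;
    }

    // Trace from a point above the layer down to the camera. If the trace clips
    // a cloud slice on the way down, the camera is under or within the layer.
    const FVector TraceStart(CameraLocation.X, CameraLocation.Y, CloudTopZ + 1000.f);
    FHitResult Hit;
    const bool bHitCloud = World->LineTraceSingleByChannel(
        Hit, TraceStart, CameraLocation, ECC_Visibility);

    // Combine with a simple altitude check so the fog only kicks in while
    // actually inside the layer, not just underneath it.
    const bool bInsideLayer = CameraLocation.Z > CloudBottomZ && CameraLocation.Z < CloudTopZ;

    const float TargetIntensity = (bHitCloud && bInsideLayer) ? 1.f : 0.f;
    FogMID->SetScalarParameterValue(TEXT("InCloudFogIntensity"), TargetIntensity);
}
```

In practice you’d probably want to blend the intensity in and out over a second or so rather than snapping it, but that’s the general idea.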
I’ll be putting up a demo before it’s out on the marketplace; I just haven’t had time to set it up yet. It’ll use the flying game template and allow some customization options, such as whether or not to simulate time of day, what cloud sampling quality to use, and whether or not to render the volumetric fog, so that everybody can see how their systems perform.