It’s the same source material as the default atmospheric fog from the engine, but I needed extra controls to hook the atmosphere to a procedurally generated planet.
There are a couple of issues left, but I’m getting close to completion, after much struggle I have to say!
To handle 3D textures, I ended up generating 2D tiles and using PseudoVolumeTexture from Common.ush. It’s silly because I have the actual 3D texture references, but I just couldn’t figure out an easy way to turn them into an editor Volume Texture asset.
Slowly but surely moving forward with the implementation of the paper.
Started introducing multiple scattering iterations. So far there are two scattering orders. It’s already possible to see the shadow of the planet in its atmosphere, which looks really cool.
Sunset looks great too already.
If the value of this atmospheric fog is based on physics, does that mean it can be used directly to create an accurate day/night cycle (i.e. the correct skylight is produced by the atmospheric fog)?
This is the same atmospheric fog as the one already available in the engine when you drop an ‘Atmospheric Fog’ actor into the world.
The twist here is that I have lots of settings exposed to make it work on my underlying procedural planets.
It would be tricky to release as it is: I’m using a plugin to run 6+ compute and pixel shaders, a bunch of C++, and heavy custom HLSL nodes with 45+ inputs. It’s a mess.
There is no light produced, as it’s a post-process effect, but you can get color information at any given point in the atmosphere. If you use a white directional light and apply this color information over it, you can achieve pretty convincing results, but it gets complicated if you want to craft a complex world on the surface.
Edit: regarding the color information available, I made a first test with volumetric clouds using it:
I’m curious as to what you are using for the planet sphere.
I have this render issue with World Composition. Apparently the world doesn’t bend beyond a 5 km radius from the player’s position. This causes a tower set 10 km away from the player to be fully visible (on a flat surface) instead of dipping below the terrain as it realistically should.
Wondering if the sphere system you are using would be able to correct this…
I’m not bending a landscape; the planet is coded using quadtrees and a custom procedural mesh component, and you can land anywhere on the planet.
It’s made of biomes blended together and mixed with some noise to hide any tiling, mostly inspired by the game Star Citizen and the various dev talks they’ve featured.
For instance, in the following video I wanted to test runtime virtual texturing: I grabbed the Landscape demo from the Unreal learning tab, turned it into a biome, and mapped an entire planet with it. It does look a bit silly from the sky and from space, but it gives good insight into the creative control you can get on the ground.
One simple thing I need to do ASAP is add a layer of bicubic interpolation when you’re close to the ground, and then I’ll consider the whole planet foundation rather solid.
What I would like to do moving forward is add a layer of editing with distance field stamps (the planet ground already generates distance fields, which was a nice addition in early 2020), and maybe local volumetric editing with voxels, all building on top of an extremely efficient quadtree-based planet.
I get better frame times (ms) when the material/instance editors and everything but PIE are closed… meaning that, potentially, your initial 60 fps could be even better without the material editor open. Either way, a pretty impressive gain. Last time I tried VT (on the Kite demo) it ended up costing 20 fps more. Maybe it’s finally time to redo that test as well.
I’ll have to look into quadtrees as well. Though surely I can come up with other solutions to bend the landscape at a distance by Earth’s exact curvature…
Also, after a brief look into quadtrees I guess I’ll be diving down that rabbit hole. Seems like it would be the perfect custom implementation to allow for near perfect frustum culling on my Ocean …
Oh, the benefit of virtual texturing is a given really: shader computation occurring only once and getting cached made performance on my modest laptop (960M) jump by 40%. I was extremely pleased and surprised.
I don’t know how Sea of Thieves does it, but if you come from space down to the ground you need some form of LOD solution, even if it’s just until you reach a certain level, with a tessellated water plane following you around.
Well yes, at least until Unreal 5, apparently.
Though I wouldn’t use the new engine as an excuse to do shoddy work, claiming “the engine takes care of it”…
My current ocean is made with large instanced hierarchical static mesh components that generate at runtime.
As with any adaptive water material with displacement and transparency, it is very expensive.
Coming in from space you wouldn’t need transparency until about 300 m above the surface. However, to make it look nice you would need the shore color to shift into lighter/sand hues, and doing that procedurally is taxing by nature…
Having a proper shore with waves breaking would also destroy performance in a large-scale “seen from space” shot… so I would need a system to go from an animated image to the shaped mesh as well… not the easiest combo of elements…