I have a challenge I’m struggling to solve.
I need to calculate how far the player can see (doesn’t need to be pixel perfect, but should be within a few hundred units) in exponential height fog.
Some things can be assumed: fog near fade distance/start distance is zero, and fog height falloff is zero, so the fog is uniform density in all directions.
I am interpolating fog density over time. Originally I just tried eyeballing the max view distance at the start and end fog densities and lerping between those two numbers, but density changes visibility exponentially, so there's some math here I'm not smart enough to wrap my head around.
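For concreteness, this is roughly what my current (flawed) approach looks like as a sketch; the visibility numbers are made-up placeholders, not real values from my project:

```python
# Hypothetical eyeballed max view distances (world units) at the
# start and end fog densities -- placeholder numbers only.
vis_at_start_density = 20000.0
vis_at_end_density = 3000.0

def lerp(a, b, t):
    """Plain linear interpolation between a and b by fraction t."""
    return a + (b - a) * t

# Halfway through the density fade this gives the midpoint distance,
# but since visibility responds exponentially to density, the real
# visible distance at the midpoint density isn't this midpoint value.
print(lerp(vis_at_start_density, vis_at_end_density, 0.5))  # 11500.0
```

So the lerp is smooth in distance, but it doesn't track what the fog is actually doing as density interpolates.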
Can anyone help with a formula, or some advice on what I need to google?