Firstly, you wouldn’t want RTX results going into a lightmap. They’re nice, but they’re still only a rough approximation of how real light behaves; their entire selling point is the “realtime” part. So to bake, you have to run a better-looking offline renderer on the GPU, and right now that means path tracing.
For a while I was working on a GPU-accelerated, path-traced lightmass replacement for UE. Not only was progress difficult, but the results weren’t all that great unless you spent a week rendering your lighting. That product made it into Unity in a different form, where I believe it’s still just a viewport renderer.
People underestimate just how much work is done when you bake the lighting for a level. It’s hundreds of thousands of times more work than simply raytracing a scene to a viewport, realtime or not. To get a good result from path tracing, you either spend more time building the lightmap (even with GPU acceleration) than you ever would running lightmass, or you rely on a denoiser, something that has only just appeared on the scene. And the denoiser is part of why RTX results aren’t worth pushing into an offline lightmap. Within the next couple of graphics card generations, realtime graphics like this will quickly reach a point where the denoiser’s artifacts are obvious and the results are not good enough. It is a temporary solution.
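To give a feel for why the sample counts blow up: Monte Carlo path tracing error shrinks roughly as 1/√N, so halving the noise costs 4× the samples, and a bake has to do that for every texel of every lightmap rather than just the pixels on screen. Here’s a minimal sketch of that convergence behaviour, using a toy integral as a stand-in for one texel’s incoming radiance (the integrand and sample counts are illustrative, not from any real renderer):

```python
import math
import random

def estimate(n_samples):
    # Monte Carlo estimate of a simple known integral, standing in for
    # one lightmap texel's gathered radiance: integral of x^2 over [0,1] = 1/3.
    total = sum(random.random() ** 2 for _ in range(n_samples))
    return total / n_samples

def rms_error(n_samples, trials=300):
    # Average the error over many independent runs to expose the
    # ~1/sqrt(N) convergence rate of the estimator.
    errs = [(estimate(n_samples) - 1 / 3) ** 2 for _ in range(trials)]
    return math.sqrt(sum(errs) / trials)

if __name__ == "__main__":
    random.seed(1)
    for n in (16, 256, 4096):
        # Each 16x jump in samples buys only about a 4x drop in noise --
        # multiplied across millions of texels, that's the cost of a bake
        # without a denoiser.
        print(f"{n:5d} samples/texel  RMS error ~ {rms_error(n):.4f}")
```

Each 16× increase in samples per texel cuts the error by only about 4×, which is exactly the tradeoff the denoiser is papering over.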
Hopefully RTX gives Epic enough tools to write a CUDA version of lightmass, but I’d wait and see what they say about that, as it could easily go either way.