Runtime lighting and performance significantly worse than in editor

The first screenshot is built in the editor; the second is built entirely at runtime, including the lights (the meshes are also imported at runtime). For some reason runtime lighting and reflections are worse, and they also take a greater performance hit.
Furthermore, runtime-placed lights seem to have screen-space emission: in this case I moved my camera past the “range”, resulting in the light no longer reaching the camera, while that doesn’t happen in the editor.
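For context on the “range” cutoff: point lights in UE use an inverse-square falloff that is windowed so the contribution reaches exactly zero at the attenuation radius. The toy Python sketch below (illustrative only, not engine code; the formula follows the documented inverse-squared falloff) shows why a camera beyond the radius would receive no direct light at all:

```python
def point_light_attenuation(distance, attenuation_radius):
    """Inverse-square falloff windowed to hit exactly zero at the
    attenuation radius. Purely an illustration of the cutoff behavior."""
    window = max(0.0, 1.0 - (distance / attenuation_radius) ** 4) ** 2
    return window / (distance ** 2 + 1.0)

# Inside the radius the light still contributes...
assert point_light_attenuation(500.0, 1000.0) > 0.0
# ...but at or beyond the radius the contribution is exactly zero.
assert point_light_attenuation(1000.0, 1000.0) == 0.0
assert point_light_attenuation(1500.0, 1000.0) == 0.0
```

If the runtime-spawned lights end up with a different attenuation radius than the editor-placed ones, this hard cutoff would explain the light visibly “not reaching” the camera.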

Has anyone had similar issues, and what may cause them?

Packaged builds set the engine video quality settings to High or Very High by default.

Do you have any UI set up to control the graphics level in-game? If so, check what setting it is on (it’s probably High or even Epic).

Yes, none of those settings make any difference. The following screenshots were both taken at Epic shading quality.
Light emission doesn’t escape screen space in the editor, while it does in our runtime example. The difference, however, is that these light actors were spawned at runtime rather than placed in the editor beforehand.

Could it be that the parts imported at runtime have different light attenuation settings? The more overlapping light attenuation volumes you have, the more costly the calculations.
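The cost concern above can be made concrete: per-pixel shading cost grows with the number of lights whose attenuation spheres contain that point, since each overlapping light has to be evaluated there. A toy Python sketch (illustrative only, with made-up light positions and radii):

```python
import math

def overlapping_lights(point, lights):
    """Count how many lights' attenuation spheres contain a point.
    `lights` is a list of ((x, y, z), radius) tuples. Shading cost
    at that point roughly scales with this count."""
    count = 0
    for center, radius in lights:
        if math.dist(point, center) <= radius:
            count += 1
    return count

# Two large overlapping lights near the origin, one small isolated light.
lights = [((0, 0, 0), 1000.0), ((200, 0, 0), 1000.0), ((5000, 0, 0), 300.0)]
assert overlapping_lights((100, 0, 0), lights) == 2   # inside both big spheres
assert overlapping_lights((5000, 0, 0), lights) == 1  # isolated light only
```

So if the runtime importer gives the spawned lights larger default attenuation radii than the editor-placed ones, more pixels fall inside multiple spheres and the frame gets more expensive.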

Although, if the runtime version is the lower picture, it seems darker.
The lower picture also seems to have the lights in a different position, like they’re behind the door frame instead of inside the room. Could it be a matter of missing position offsets on import/creation with respect to the world center?

I’ll have to confirm the first question.
As for the second, all lights, positions and parameter values are identical in both. That was the purpose of the test. The lights would just be more towards the center of screen space if I moved my camera slightly lower.

Here’s a quick demonstration. The first scene was placed in the editor beforehand, while the second one is placed entirely at runtime. All materials and lights have the same parameter values.

It’s either camera auto-exposure, or are you using the experimental screen-space global illumination?
It would make sense for the scene to get darker when you look up, since the lighting information is no longer present in screen space.

I just checked: all we’re using is Lumen as GI in the engine rendering settings, and in the post-process volume all settings are pretty much default Lumen except sky leaking. I assume that still shouldn’t make a difference between the editor and runtime scenes, since the editor one is working completely fine.
The only Beta feature we have enabled is Virtual Shadow Maps.

A thread on Reddit seemed to have info on Lumen and what could cause it to behave weirdly.

Seems some parts might be calculated in screen space.
It also seems not to play well with single large concave meshes.
Thin walls can also be problematic.

Have you tried a version in the editor that imports the parts in PIE / game mode?
Does it behave the same way then?

It could be that if the objects are static, the calculations are cached.
Do your runtime-imported meshes have their mobility set to Static or Stationary once they are in the scene, or are they set to Movable?
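To illustrate why mobility matters here: results for static geometry can be computed once and reused across frames, while movable geometry has to be re-evaluated every frame. This is a toy Python model of that caching idea, not engine code; the class and mesh names are made up:

```python
from enum import Enum

class Mobility(Enum):
    STATIC = "static"
    MOVABLE = "movable"

class ShadowCache:
    """Toy cache: static meshes are shaded once and reused;
    movable meshes trigger a full recompute every frame."""
    def __init__(self):
        self.cached = {}
        self.evaluations = 0

    def shade(self, mesh_id, mobility):
        if mobility is Mobility.STATIC and mesh_id in self.cached:
            return self.cached[mesh_id]   # reuse the cached result
        self.evaluations += 1             # full recompute
        result = f"shadows for {mesh_id}"
        if mobility is Mobility.STATIC:
            self.cached[mesh_id] = result
        return result

cache = ShadowCache()
for _frame in range(3):
    cache.shade("wall", Mobility.STATIC)
    cache.shade("door", Mobility.MOVABLE)
# The static mesh is evaluated once, the movable one every frame.
assert cache.evaluations == 4
```

If the runtime importer leaves everything Movable by default, you would pay that per-frame cost even for geometry that never moves, which could account for part of the performance gap.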

Yes, I’m aware of the big/complex mesh limitation in UE5; however, it’s working fine in the editor in our case, so that shouldn’t be the cause.
After further testing I figured out it’s something to do with the mesh importer. A pre-cached mesh spawned at runtime, along with runtime-spawned light actors, works exactly the same as the editor version: a crisp result. So we’ll look into that part.

Did you ever solve this issue? The difference between editor and runtime reflections is enormous, and I’m guessing it has something to do with ray tracing, since the runtime reflections look a lot like 5.0.3 mirror reflections under ray tracing.