Lumen performance vs Raytracing

Did you also enable Virtual Shadow Maps? Because that kills performance on my project. Lumen with regular shadow maps works fine performance-wise; Virtual Shadow Maps basically cut my FPS in half and produce a lot of hitches.
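
If you want to sanity-check that on your own scene, a quick A/B test from the console (standard UE5 cvars and stat commands; exact costs will vary per project):

```
r.Shadow.Virtual.Enable 0
stat gpu
stat unit
```

Toggle `r.Shadow.Virtual.Enable` between 0 and 1 and compare the shadow passes (e.g. Shadow Depths) in `stat gpu`, plus the overall GPU time in `stat unit`.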

^ This. When reporting FPS, can people please list the resolution?

A scene that runs at 60 FPS at one resolution can melt the same GPU at 4K.
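
A quick way to check how resolution-bound a scene is (stock UE5 commands, nothing project-specific):

```
stat unit
r.ScreenPercentage 50
```

If GPU time falls off a cliff at 50% screen percentage, the scene is resolution-bound, and a 1080p FPS number says very little about 4K.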

Can Lumen work on a non-RTX card?

If I remember correctly, it works on all DX11 GPUs, with some features requiring DX12.

It’s all here: Lumen Technical Details | Unreal Engine Documentation
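
For reference, a minimal sketch of forcing the software path in DefaultEngine.ini (these are the stock renderer settings; defaults and requirements differ between engine versions, so check the page above for yours):

```
[/Script/Engine.RendererSettings]
; Use Lumen for dynamic GI and reflections
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; Disable the hardware ray tracing path so Lumen traces software SDFs instead
r.Lumen.HardwareRayTracing=0
```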

I don't like anything that increases frame times or input lag; for me that would not be a great improvement, even if RTX and Lumen look nice. I played Cyberpunk with RTX on and that was good. I think latency and frame times are underestimated topics, because not many players realize all the aspects that make them like a game. Often enough, visuals and sound quality get noticed more than other optimizations that contribute just as much to the feel of a game.

Has this been fixed, or is Lumen still more expensive?

It depends on your quality level. As of UE 5.2, Lumen runs way, way better than it did in Early Access.

Lumen can scale far lower and slightly higher than standalone RT, depending on the desired feature set. Even low-quality Lumen GI is by and large still better than RTGI, and RTGI doesn't support emissives or many other features.

At the lowest end, Lumen can use an irradiance-field gather against the global distance field, which is very cheap: only slightly more expensive than DFAO.

At the highest end, Lumen can (experimentally) use hardware ray tracing to intersect full-resolution Nanite meshes, with multiple bounces of reflection and support for animated geometry, two-sided foliage, and translucency. In the majority of use cases it's basically indistinguishable from the path tracer.
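
To make the two ends of that spectrum concrete, here's a rough sketch in console variables. These are real cvars, but defaults and enum meanings shift between engine versions, so treat it as a starting point rather than a recipe:

```
; Cheap end: trace only the global distance field, skip per-mesh SDF detail traces
r.Lumen.TraceMeshSDFs 0
sg.GlobalIlluminationQuality 1

; Expensive end: hardware ray tracing against Nanite geometry (5.2+)
r.Lumen.HardwareRayTracing 1
; Hit lighting instead of the surface cache (enum values vary by version; check the cvar help)
r.Lumen.HardwareRayTracing.LightingMode 1
r.Lumen.Reflections.HardwareRayTracing 1
r.RayTracing.Nanite.Mode 1
```

Running `stat gpu` with each set side by side makes the cost gap very obvious.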

They increased Lumen's default quality; you probably have to tweak the settings back down if you want roughly the same quality and performance as before. Also, hardware RT will NOT save performance. It's intended to increase quality, and only for some rendering cases like reflections/translucency, at the expense of performance.
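
If you just want something close to the old cost back without hand-tuning individual cvars, the scalability groups are the coarse knob (stock UE5 scalability cvars, 0 = low through 3 = epic, 4 = cinematic):

```
sg.GlobalIlluminationQuality 2
sg.ReflectionQuality 2
```

Dropping GI and reflection quality one notch is often enough to offset a version-to-version quality bump.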

Good luck using this engine, then.
They decided 1080p was good enough back when they claimed the Kite demo ran well and was performant.

Three questions for you, so you can draw your own conclusions:

How many years have passed since the GDC Kite demo?
How badly does Kite still run at 4K?
How much do you think the Epic team cares?

Bonus question:
Fortnite lost over 30% of its FPS in the past 2 years on a 1080 Ti. How much do you think Epic did to remedy this?

My 2c:
They are never going to get their rendering act together. Go to CryEngine, or anything else really, if you want performance.
If you're OK with sub-par rendering and lower performance, then Epic is just perfect.

The biggest problem is that those engines don't come with Quixel, MetaHuman, or the insane amount of dedicated information UE has. None of them have Blueprints either. NO engine compares to the amount of resources that come with UE5.

We cannot use our content elsewhere, because it is licensed only for UE5 products.

My studio and I have always made our own content because of how unreliable Unreal is… it's literally easier to put money into your own assets than to rely on the licensing of Quixel and the other stuff that's “free” or that Unreal allows for…

Realistically though, even if you're bound to Unreal in many respects, you can still try and test other engines, or pay to move to them by purchasing the appropriate licenses for whatever you use…