Lumen GI and Reflections feedback thread

Maybe it was just a bug that I had it working at some point, just like I had GI from Subsurface at some point, but nowadays it doesn't work anymore. :sob:


I’m a little curious what you mean about using RT for higher-quality reflections at this point. As I understand it, standalone RT reflections lack specular occlusion and the extra energy from where diffuse lighting would be bouncing around, so from a radiometric perspective lumen reflections are more correct. As of right now, I believe that lumen reflections are more or less feature-complete compared to standalone RT reflections. I think I may have said this earlier on the channel, but it may be a question of whether the ‘messier’ surface cache is more visually suited to your work than clean but light-leaking reflections.

And you are correct, lumen can scale down to lower-end hardware. When I say ‘barely’, I say that from the perspective of talking to developers working with it, who have described how hard it’s been to get lumen working in concert with nanite, VFX, AI, gameplay logic, etc. You can make anything happen if you’re willing to give it the frame time.

How do you mean, RT reflections and lumen fallback reflections?


This is probably quite a big deal in some production workflows, actually: via some sort of wizardry (I truly don’t know how they did this), 3D Niagara fluid sims are now visible in the lumen scene. This means you won’t get strange lighting discontinuities when you don’t have screen-space information on a sim.

I would love for the devs to present on how they did this at some point: this is HWRT, so somehow they’re constructing a mesh surface out of a FLIP sim and building a BLAS with it every frame, fast enough for real-time (marching cubes)? Seriously impressive stuff.
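To make that speculation concrete, here's the shape the per-frame path might take, as pseudo-C++. None of these names are real engine API - `FFluidGrid`, `ExtractIsoSurface`, and `BuildOrRefitBLAS` are placeholders - it's just the "mesh the fluid, then make it traceable" idea spelled out:

```cpp
// Hypothetical per-frame flow for getting a FLIP sim into the HWRT scene.
// NOT real Unreal API - placeholder types and function names only.

struct FFluidGrid   { /* density or signed-distance field from the FLIP sim */ };
struct FSurfaceMesh { /* vertex + index buffers of the extracted iso-surface */ };

FSurfaceMesh ExtractIsoSurface(const FFluidGrid& Grid, float IsoLevel); // e.g. marching cubes on the GPU
void         BuildOrRefitBLAS(const FSurfaceMesh& Mesh);                // bottom-level acceleration structure

void UpdateFluidForLumen(const FFluidGrid& Grid)
{
    // 1. Mesh the fluid: pull a triangle surface out of the sim's field.
    const FSurfaceMesh Surface = ExtractIsoSurface(Grid, /*IsoLevel=*/0.5f);

    // 2. Make it traceable: (re)build or refit a BLAS from that mesh every
    //    frame, so the TLAS can reference the fluid like any other HWRT
    //    geometry and Lumen's hardware rays can hit it even when there is
    //    no screen-space information on the sim.
    BuildOrRefitBLAS(Surface);
}
```

Doing that whole extract-and-build step within a real-time frame budget is the part that impresses me.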

Forget what I said about standalone RT, it won't work anymore anyway - had a bit of a brainfart there.

Idk why others have so much trouble with Lumen and Nanite - given how easy they are to work with once you understand Lumen and its reliance on Nanite.

For me, Lumen has a very predictable cost and behaviour that is very stable unless you “overdo” something, hence why I have had it in a released game for almost a year (5.0, that is), and it runs fine on graphics cards from almost 10 years ago.
In fact, it's so stable that I don't even worry about Lumen's performance for the two upcoming games anymore - it just works, with simple meshes and Megascans alike, and even plants, effects etc.

Idk, maybe my initial involvement with Lumen during EA1 and EA2, and all the issues I had to work around, now comes in handy and has turned into an advantage that makes working with Lumen almost “braindead” easy.

I also think I might be able to get (some) translucency working with Nanite by abusing the “foliage mode” of Nanite, but I have not yet gotten around to running some experiments.

Meh, if I find the time, I will collect what I know about the two (Lumen/Nanite) and put it into the dev-hub, so that it's not spread across multiple topics and thousands of posts.

No, a clear mirror in a room will not look anywhere near as good as the “deprecated” RT reflections.

And while 5.3 has this multi-bounce reflection now, it can look very noisy, and I didn't find any command that could increase lumen reflection samples to fix that either.

And the mixing of screen-space and lumen RT can often look so bad that I have to disable screen traces because they mess with the reflections.
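For reference, in case anyone wants to try the same workaround: screen traces can be toggled per feature via CVars - on a 5.2/5.3-era engine the relevant ones should be `r.Lumen.Reflections.ScreenTraces` and `r.Lumen.ScreenProbeGather.ScreenTraces`. A minimal sketch of setting them from game code, assuming those CVar names exist in your build:

```cpp
#include "HAL/IConsoleManager.h"

// Disable Lumen's screen-space traces at runtime (e.g. from BeginPlay).
// CVar names assume a 5.2/5.3-era engine - check them in your build first.
static void DisableLumenScreenTraces()
{
    if (IConsoleVariable* Refl = IConsoleManager::Get().FindConsoleVariable(
            TEXT("r.Lumen.Reflections.ScreenTraces")))
    {
        Refl->Set(0, ECVF_SetByGameSetting); // keep screen traces out of reflections
    }
    if (IConsoleVariable* GI = IConsoleManager::Get().FindConsoleVariable(
            TEXT("r.Lumen.ScreenProbeGather.ScreenTraces")))
    {
        GI->Set(0, ECVF_SetByGameSetting);   // optionally do the same for the GI gather
    }
}
```

The same values can of course just be typed into the console or put in an ini for quick testing.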

Sure, if the reflections are not prominent, then it can look just as good as or better than RT. But it only takes one object looking broken because of its reflections for the whole illusion of realism to fall apart instantly.

If possible, they should be integrated into Lumen, since they are superior. (or at least have a lot more traces etc. in HWRT mode than SWRT)

And another benefit of “the old RT reflections”: you could, in theory, run them on RT cores and Lumen in software, therefore better utilizing the hardware. (It should also make this accessible to lower-end GPUs that don't have enough RT cores to compute GI and reflections at the same time.)

It's visually fine to have GI in software, while reflections massively benefit from proper raytracing.
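For what it's worth, the knobs involved in such a split would presumably be `r.Lumen.HardwareRayTracing 0` to keep Lumen's own traces on the software path, plus `r.RayTracing 1` and the post-process volume's Reflection Method for the deprecated standalone reflections - whether current engine versions still allow that combination is exactly the open question here, so treat this as a starting point for experiments rather than a confirmed setup.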


Please, take a look:

Let’s see if it’s better than Epic’s solution.

I think that’s a fair point you offer, and I don’t doubt you on lumen being able to hit those numbers (I would love to see your game :). But the devs I’ve talked to are mostly AAA, and what I’ve come to understand about early-adoption technology (early being a relative term, as lumen was shown off in, what, May of 2021?) is that technology development moves at a glacial pace for AAA games. Game planning is often thought of in blocks of more than half a decade, from preproduction to the tail end of support. When you have everyone from interns who know nothing about rendering up to veterans who’ve been doing their work since the Quake days, you have a vast swath of people who have to be brought up to date. Add to that the numerous outsourcing studios modern AAA games depend on, and educating all of them about the ins and outs of lumen, and having them author their content and levels accordingly, suddenly becomes a heck of an ordeal.

Not to mention, I, like you, have been using UE5 since the EA1 days (remember how big that gulf between EA2 and 5.0 was?). We’ve seen lumen through pretty much its entire public-facing development cycle, along with the accumulated tips, tricks, documentation, and CVars. Developers just diving into lumen may read the adverts on it being a ‘dynamic GI and reflections solution’ and assume that ‘it just works’; they’ll be surprised by the yellowing effect, the noise in reflections in the .2-.4 roughness range, the surface cache behavior with complex geometry, etc. They may give up on lumen because it’s not doing everything out of the box, rather than treating it as a very powerful piece of tech that needs some know-how to use.

I am happy that Epic has written a lumen optimization guide for 5.2, but it does bear repeating that lumen, like blueprints or Niagara or any other UE system, requires time and effort to learn, and I could see people unfamiliar with it struggling to make use of it. If you’ve been doing it since day 1, I’m not surprised you’re able to get it to do what you want, is essentially what I’m trying to say.

Also, I would seriously love to learn how you manage to get translucency working with nanite - I am truly quite curious.

I’m unsure if the expense of mixing and matching tracing kernels would outweigh the potential gains, but if testing bears out a perf improvement, I think the idea of mixing SWRT/HWRT could be very interesting. Coherence and data-structure size would be very fascinating.

So, some sort of depth buffer-cubemap raymarching backpropagated to the scene?

I was wondering if something like this would be possible, given that you’re essentially rasterizing a scene, tracing against a coherent depth buffer, and then pressing it back to the scene geo, but I’d love to know the perf.
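To make that guess concrete, the core loop of “trace against a cubemap depth buffer” might look roughly like this - shader-style logic written as standalone C++, with `SampleCubeLinearDepth` standing in for whatever actually fetches linearized depth from the cubemap in a given direction (all names here are placeholders, not engine API):

```cpp
#include <cmath>

struct FVec3 { float X, Y, Z; };

static FVec3 Sub(const FVec3& A, const FVec3& B) { return { A.X - B.X, A.Y - B.Y, A.Z - B.Z }; }
static float Length(const FVec3& V)              { return std::sqrt(V.X * V.X + V.Y * V.Y + V.Z * V.Z); }
static FVec3 Normalize(const FVec3& V)           { const float L = Length(V); return { V.X / L, V.Y / L, V.Z / L }; }

// Placeholder: reads linearized depth (distance to the nearest surface) from
// the depth cubemap in direction Dir, as seen from the capture origin.
float SampleCubeLinearDepth(const FVec3& Dir);

// March a ray and report a hit once it falls behind the stored scene depth.
bool TraceAgainstDepthCube(FVec3 P, const FVec3& RayDir, const FVec3& CaptureOrigin,
                           float StepSize, int MaxSteps, FVec3& OutHit)
{
    for (int i = 0; i < MaxSteps; ++i)
    {
        const FVec3 ToP       = Sub(P, CaptureOrigin);
        const float SceneDist = SampleCubeLinearDepth(Normalize(ToP));

        // The sample point is farther from the capture origin than the scene
        // surface in that direction, so the ray has gone "inside" geometry.
        if (Length(ToP) >= SceneDist)
        {
            OutHit = P; // a real implementation would refine with a secant step
            return true;
        }
        P = { P.X + RayDir.X * StepSize, P.Y + RayDir.Y * StepSize, P.Z + RayDir.Z * StepSize };
    }
    return false; // no hit within the march budget
}
```

The perf question would then mostly come down to step count and how incoherent the cubemap fetches get.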

I agree, this is an “issue” since they will also have to change their workflows in other departments (meshes need to be treated differently if you don't want to get in trouble with Mesh Distance Fields and therefore lumen, which then ties back into Nanite and Lumen etc.).

I don't expect any AAA game to use Lumen anytime soon, simply because the workflow is too different from what has been used in the past. (I was thinking more about “everyone” when writing what I wrote - except AAA, because those are slow-moving entities that still use UE4 currently.)

I mean, it's not hidden away or anything, and you have actually seen it a few times in this thread already - it has lumen, large moving geometry, high-quality plants, high- and low-poly meshes, lots of instancing, translucency and glass, Niagara effects, subsurface etc., and even a fallback mode for old hardware. (I couldn't do everything I wanted to, because EA1/2 were a mess, so I scaled things back and worked around bugs more.)

I know that it's not perfect; it's not a small game, but also not AAA^^ (Remember how plants were a big issue back then, and sometimes still are, until I figured out that you could “fudge” the MDF etc. under a wrongly labeled option in the mesh? (It's somewhere in this thread.))

But here you go, that's how it looked in the end with 5.0 and all its limitations - and for that, I think it looks impressive, even with the 5.0 “artifacts”:



EDIT: The forum butchered the image quality >.< The original was 1440p with TSR etc., so it was very sharp and crisp, not blurry.

PS: the warmer color scheme is not some Lumen bug etc., I actually made it like that - it's friendlier than the sterile blue tint many games have.

The system requirements I gave you for Lumen are literally taken from the game, I didn't make them up. (In fact, it probably could run even on a 1060 if the game wasn't as wasteful with other assets like the plants or lots of glass etc.)

That's probably one of the big downsides of lumen: it doesn't “just work” on its own, since it needs other features to work properly, simply because it relies on the MDFs and the cards generated etc. (For example: Per Instance Custom Data doesn't work currently; for me it just doesn't appear in the Lumen scene.)

I have not yet tested it; it's currently just a thought that should work at least in limited fashion, since we clearly have Nanite for trees etc. now, which also have some translucency in their materials. (In short: anything translucent that can be a masked material can also be a nanite mesh, going by that logic. Idk if we need to sell it to the engine as “Foliage” for it to work, but in theory it should be doable, with limitations.)


That is VERY pretty, keep up the good work!

I could surely reimagine the hours spent mowing down the unwary masses on Ziba-tower with my trusty AEK + Kobra… :smiley:


Sure Unigine will release a benchmark with it. Can’t wait!

Excuse me? Maybe in an empty scene?

My overclocked 3060 can't do native 1080p at 60fps in Fortnite (with lumen and Nanite) of all things, with no motion blur, post process, or AA, medium shadows, High (“60fps”) lumen, reflections, and medium effects and draw distance?

Several issues with that. At 1080p you can't use some crappy upscaler; you need a temporal AA method because of lumen, and post process is important for some games. I can use the same settings in City Sample (in editor (not play in editor), half a mile by half a mile) and I'll get the same fps as in Fortnite.

That means we need micro-optimizations in VSM, Lumen, and Nanite calculations.

With the same settings, we get 49fps at 1080p on a 3060

Nanite costs around 8ms, SWRT Lumen costs around 4ms if optimized, and VSMs cost under 2ms - but for some reason City Sample's VSM cost is over 3ms when played in editor with all AI turned off? Wtf?

But my point is we need performance increases of 22% in those categories to achieve native resolution (determined by the card's target) at 60fps, which should be the standard in this day and age.
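(To spell out the arithmetic behind that 22%: 49fps is about 20.4ms per frame (1000/49), while 60fps needs 16.7ms (1000/60), and 20.4 / 16.7 ≈ 1.22 - so everything has to get roughly 22% faster across the board.)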

I think you have misunderstood the performance numbers I had written down.

A 3060 will not reach 60 fps, because it's a potato-tier graphics card; it will be somewhere between 40 and 60 depending on the game and scene.

But this is completely fine, the target in next-gen games for those cards isn't 60 anyway; they can't run any new game “maxed out” at 60 without the help of upscalers.

Here is how the cards stacked up 1 year ago; ignore Intel because they couldn't figure out their drivers, so stuff underperformed. (I know the numbers are roughly correct, because someone tried it on a 1060 on YouTube, the 6900 XT numbers are from myself, and the rest is based on BL3 DX12 benchmarks, relatively scaled. But it depends on drivers, graphics card, cooling, overclock etc.)
[Image: benchmark table of the GPU numbers described above]
^^A 1080 Ti (the high-end card from that generation) should reach 60 fps, going by my numbers and it being something like 80% faster than a 1080 (41 × 1.8 ≈ 73, minus some % because of DX12).

But now comes the fun part… nvidia (and AMD to some extent) being absolute [insult] with performance stagnation on those low end cards:

[Image: screenshot comparing 3060 and 4060 performance]

Isn't it fun how a 4060 is just another 3060 that draws less power, because nvidia thought: hey, let's sell them the 4050 as a 4060…

So, the reality is that it will take another 2+ years until those “60” cards can run lumen at 60 fps in 1080p, and that assumes that nvidia isn't being a [insult goes here] again. (And that's something I would not bet on… ngreedia will ngreedia whenever it thinks it can get away with it. And AMD will casually price-adjust to nvidia, and the same will go for intel… we are screwed.)

In the meantime, high-end performance went through the roof and we are at ~2x the performance of a 1080 Ti, while nvidia hasn't even released the 4090 Ti (the cooler exists though, it's gigantic).

TL;DR: Forget about the low-end cards, you have two (reasonable) choices:

1: Don't use lumen (as most people do)
2: Use lumen, but also have the game work and look just fine with it disabled (that's what I have chosen, but it isn't viable for every game).


Hey! Is there any workaround for this issue other than disabling screen traces?
(I guess it's) screen space bleeding into lumen reflections in both 5.2 and the 5.3 preview. It happens with hardware and software lumen, no matter the settings. The test scene is just a simple opaque mirror material and a translucent material on the sculpture.

5.3 and Epic’s own HDRIBackdrop plug-in is different but not better…

I appreciate you sharing the numbers you have on this here, and I’d also like to add something else: 60FPS is, fundamentally, a performance target. One of the golden rules in game development is that your performance metrics should be derived from your game design. I.e., if you have a puzzle game that’s focused on really high-quality visuals and targeting next-gen machines, then 30FPS is totally acceptable, because the core of your gameplay doesn’t depend on 60FPS. However, if you’re architecting a super fast-paced shooter, maximizing FPS will directly affect the fluidity, and therefore the quality, of play.

What I’m basically saying is that the performance you get is a decision that should be made from a game architecture perspective, and not a tech perspective. If your game has tons of PCs, a giant map, complex AI, and more, then you’re probably going to want your visuals to not be super high-end. Game development is all about tradeoffs, and if @Yaeko’s game is single-player and puzzle-based, with no significant networking work needing to be done, I’m not surprised that it would be way less demanding overall than Fortnite with similar graphical features.

I’m pretty sure Epic has some sort of SampleSceneColorAtHit command meant to specifically resolve that issue? Although translucency may break it, alas.
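(If anyone wants to dig for it: the CVar should be something along the lines of `r.Lumen.Reflections.SampleSceneColorAtHit` - I haven't verified the exact name, but typing `r.Lumen.Reflections.` into the console and letting it autocomplete should surface whatever it's called in your engine version.)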