Both DX11 and SM5 for Lumen were deprecated a while ago, I think, and I don’t think they’ll be coming back; although frankly, your perf numbers are remarkable.
It was working in 5.4, and as I showed here, SM5 Lumen support is mentioned in the release notes and the official docs. Nanite requires SM6 because of Virtual Shadow Maps, but Lumen was still working in DX11 as of 5.4.4.
Nah, we have it running on APUs and really cheap non-HWRT pixel streaming machines from 5.1 through 5.4.
It’s literally the way to get performant GI in UE5 (GTX 1070 my guy… not even RTX).
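For anyone who wants to try it, this is roughly the config we pin projects to. It’s a minimal sketch assuming the stock 5.x renderer cvars; double-check them against your engine version:

```
; DefaultEngine.ini (sketch; verify cvars against your engine version)

[/Script/WindowsTargetPlatform.WindowsTargetSettings]
DefaultGraphicsRHI=DefaultGraphicsRHI_DX11

[/Script/Engine.RendererSettings]
; Lumen for GI and reflections, software ray tracing only
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
r.Lumen.HardwareRayTracing=0
; mesh distance fields are required for Lumen software tracing
r.GenerateMeshDistanceFields=True
```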
I track source changes, and I’m almost certain the Fortnite and UEFN DX12 requirement got pushed into UE 5.5 main and this was an unintended side effect.
Well, the more you know, I suppose. I knew academically that handmade LODs and the older render path could improve Lumen throughput, but I’m still really impressed by your numbers. Even with how much better 5.5 runs, it’s still significantly behind what you’ve achieved. I feel like I know someone who would be very interested in this behavior.
Yeah, it’s completely swept under the rug, which in my opinion is a failure to market and communicate real optimization paths… Even today, in UE 5.5, look at this.
That’s a total increase of about 123% over the initial frame time in ms (to illustrate: a scene that started at, say, 8 ms would land around 17.8 ms).
I feel this is a failure to capitalize on an area where so many people who have been asking and yelling that UE5 is too heavy could simply have been told: fall back to DX11, since older hardware, drivers, and chips are better suited to DX11 and SM5.
In all honesty, it was our “secret weapon” for running really cheap pixel streaming instances with Lumen SWRT GI without needing HWRT-capable machines… this change really hurts our business model and has a direct impact on operating cost. With the old starting overhead budget we could squeeze 4 users per instance with no issues… now we’ll be lucky to hit 60 fps with Lumen at 2 users per instance unless we go g5, and that’s pretty sad.
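For context, each instance boots headless with roughly this command line. The flags are from the stock Pixel Streaming plugin; the signalling URL and resolution here are placeholders:

```
MyApp.exe -dx11 -RenderOffscreen -Unattended ^
    -PixelStreamingURL=ws://127.0.0.1:8888 ^
    -ForceRes -ResX=1280 -ResY=720
```

The -dx11 flag is the whole trick: it keeps the RHI on D3D11/SM5 so Lumen stays on the cheap software path. In 5.5 that combination no longer behaves the same, which is exactly the problem.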
This affects me too! I’ve had the same problem with poor performance and video memory issues when using DX12 in my project, and I saw the difference immediately when migrating to 5.5, since I’m using Lumen to illuminate everything, including interiors (using skylight leaking to brighten rooms). My game looks AWFUL now! Luckily I backed my project up before migrating. I’m hoping this is an oversight, since, as OP said, continued support for Lumen under DX11 is in the update notes, so I’ll give it some time before deciding whether to roll back to 5.4 or start optimising for DX12 in 5.5. Luckily my game is a passion project with no time limit on it. I imagine actual devs with this same issue must be fuming right now…
It’s funny: most of the people I know working with Lumen have taken DX12 and SM6 as a given. I don’t think many people saw that older path as technically optimal, although that was largely because workflow needs superseded performance. It’s a massive bummer that the regressions are so substantial, but I’m confused as to why exactly you need to upgrade from 5.4 to 5.5?
Because eventually third-party APIs and frameworks update and older versions become abandoned. Look at OpenVR vs OpenXR, Android support for UE 4.27, Google Play requirements, WebRTC standards, etc. While we “can” just stay on 5.1, which is currently our prod version, it becomes a liability to keep an application running on older engine versions. I’ve been tracking pixel streaming memory leaks across three minor versions now, looking to move to either 5.4 or 5.5, but it appears we’ll have to wait or overcome other issues: pak chunking doesn’t seem to work in 5.5 like it did in 5.4 either, and that’s primarily how we build out our room loader for our cloud app (rough sketch of our setup below).
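For reference, the chunking setup we depend on is basically the standard one; the folder and chunk ID here are just illustrative:

```
; DefaultGame.ini
[/Script/UnrealEd.ProjectPackagingSettings]
bGenerateChunks=True

; then one PrimaryAssetLabel asset per room folder, e.g.
; /Game/Rooms/Room01/Label_Room01 with ChunkId=1001,
; so each room cooks into its own .pak for the loader to mount.
```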