We’re having issues with VSM not working for our packaged project on a specific machine.
On my own machine and another one that we have tried, VSM works fine with sharp shadows.
But on the error machine the shadows are all blurry. This machine is running NVIDIA Mosaic over two projectors. I have the specs written further down.
I can replicate the look on my own machine with r.Shadow.Virtual.Enable 0, which furthers my suspicion that something is wrong with VSM.
However, on the faulty machine r.Shadow.Virtual.Enable still shows 1, so it has not been disabled in that way.
Specs:
NVIDIA RTX A5000
NVIDIA Mosaic over 2 projectors
Projector res 3200x2400
Final resolution 3200x4800
As you can see, the resolution is much higher than a regular screen. Could there be a limit to VSM based on the high screen resolution?
Are there any good console commands to debug this?
If I had to guess, you’re probably on the right track here. It’s likely that the high resolution and two screens are overloading your memory, causing Unreal to fall back to traditional shadow maps. A few debug commands you can try to test this theory:
r.Shadow.Virtual.Debug 1 to confirm the virtual shadow maps are indeed not running
r.ScreenPercentage 50 to see if lowering the screen resolution re-enables VSM
Scalability to see if any settings are being automatically downgraded for your hardware
stat gpu to see if your GPU is running out of memory
If any of these raise flags, we can go from there!
Hi Sarah!
Thanks for your input, and sorry for the late response; it seems I don’t get email notifications.
I have tried out the suggested commands but no luck. r.Shadow.Virtual.Debug 1 does not seem to exist. The closest I could find was r.Shadow.Virtual.DebugSkipMergePhysical, but that does nothing as far as I can see.
This is my stat gpu output; I don’t really know what I should be looking for.
Alright, so the fact that the debug command did not exist suggests that the engine is indeed turning off Virtual Shadow Maps. The Scalability results are also alarming. 3 should be Epic scalability (i.e. virtual shadow maps), but the “custom” reading indicates that something is forcibly overriding it, likely whatever is forcing the regular shadow maps.
Typing in “scalability 3” or “sg.ShadowQuality 3” doesn’t fix this?
If you have not manually adjusted any of your scalability settings (you can check this in the .ini files; or, if there is custom Blueprint/C++ code that adjusts scalability, that would be the culprit), then the next step I would suggest is optimizing your project to try to force VSM to re-enable. I would temporarily switch TSR to TAA and turn off Lumen, then check whether VSM re-enabled.
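For reference, a minimal sketch of where those overrides usually live, assuming a standard Unreal project layout (the file paths and section names below are the typical defaults; your packaged build may store GameUserSettings.ini under the user’s AppData folder instead):

```ini
; Saved/Config/Windows/GameUserSettings.ini (per-machine settings)
; Any per-group value set here can make Scalability report "custom".
[ScalabilityGroups]
sg.ShadowQuality=3

; Config/DefaultEngine.ini (project-wide render settings)
[/Script/Engine.RendererSettings]
r.Shadow.Virtual.Enable=1
```

Comparing these files between a working machine and the faulty one is often the fastest way to spot a stray override.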
To sum it all up: something is forcing VSM off, though if it were happening through traditional methods I would expect the scalability settings to downgrade to 2 instead of reading as “custom”. Make sure there is nowhere that your scalability settings are being overridden or dynamically set, try forcing them to full quality, and if nothing else works, apply optimization methods to see if VSM will dynamically re-enable (you may have to run the command “scalability auto” to force the update). You can also add a command to your level blueprint to force scalability settings, though a warning: that might tank your performance when you send out the build. If you do use this method, I would recommend adding in manual scalability overrides.
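To make that concrete, these are the console commands I have in mind; run them in the console on the faulty machine, or wire them into an Execute Console Command node in the level blueprint. This is an illustrative sketch for testing, not a guaranteed fix. The first line forces every scalability group to Epic, the second targets shadows alone, and entering a cvar with no value should print its current state so you can check whether VSM came back on:

```
scalability 3
sg.ShadowQuality 3
r.Shadow.Virtual.Enable
```

If the shadows sharpen after forcing quality, that confirms something on that machine is dynamically downgrading scalability rather than VSM itself being broken.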