I’m not sure about 5.4 & 5.5 since they were super broken, but 5.3 supports Variable Rate Shading on Masked Materials while 5.6 does not.
This is a major issue.
It (SSGI) is obsolete and will be removed in future versions.
Feedback: Linux bug: this setting cannot be turned off in 5.5.4 and 5.6; it just keeps turning itself back on after a restart.
Editor Preferences > General > Appearance > Enable High DPI Support
The only working solution is the -nohighdpi argument on the Unreal Editor launcher.
I see UE5.6 now includes the latest OpenXR 1.1.46 headers. I tried to integrate the Logitech MX Ink stylus, which requires the “XR_LOGITECH_mx_ink_stylus_interaction” extension and the “/interaction_profiles/logitech/mx_ink_stylus_logitech” interaction profile, but when I hold it, the interaction profile for the right/left hand is always null. Any clue?
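Not a full answer, but one way to narrow it down: a minimal sketch at the plain OpenXR C API level rather than through UE’s wrappers (only the extension and profile strings come from your post; the calls themselves are standard OpenXR 1.x). It queries which profile the runtime actually bound for the right hand. If this stays null, the usual suspects are the extension not being enabled at instance creation (in UE that typically has to go through an IOpenXRExtensionPlugin implementation) or no bindings ever being suggested via xrSuggestInteractionProfileBindings for that profile path.

```cpp
#include <openxr/openxr.h>
#include <cstdio>

// Check which interaction profile the runtime has bound for the right hand.
// If this reports nothing even after the session has synced actions, the
// runtime never accepted the MX Ink profile (extension missing from
// XrInstanceCreateInfo::enabledExtensionNames, or bindings not suggested).
void CheckCurrentProfile(XrInstance Instance, XrSession Session)
{
    XrPath RightHand = XR_NULL_PATH;
    xrStringToPath(Instance, "/user/hand/right", &RightHand);

    XrInteractionProfileState State{XR_TYPE_INTERACTION_PROFILE_STATE};
    if (XR_SUCCEEDED(xrGetCurrentInteractionProfile(Session, RightHand, &State))
        && State.interactionProfile != XR_NULL_PATH)
    {
        char Buffer[XR_MAX_PATH_LENGTH];
        uint32_t Count = 0;
        xrPathToString(Instance, State.interactionProfile,
                       sizeof(Buffer), &Count, Buffer);
        printf("Active profile: %s\n", Buffer);
    }
    else
    {
        printf("No interaction profile bound for /user/hand/right yet.\n");
    }
}
```

Also worth remembering that xrGetCurrentInteractionProfile only reports a profile after the runtime has processed the suggested bindings and the session is focused, so listen for XR_TYPE_EVENT_DATA_INTERACTION_PROFILE_CHANGED before concluding it failed.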
D3D11 SM5 or D3D12 SM5?
Just feedback: I DO like the preview button added to the material nodes; if there was any one feature we’d want a shortcut added for, that’s it. THANK YOU!!
VLM is static and lives in the persistent level; it doesn’t get streamed with sublevels. You should always check how the VLM grid is positioned relative to the level’s geometry: ideally all samples sit in empty or solid space, since border cases can cause issues. The default grid size of 200 is usually too coarse for small levels (indoor scenarios), so you should reduce it; a grid that is too sparse will let light bleed through walls due to interpolation. At the same time, making it too dense puts stress on VRAM, as VLMs are preloaded there. See the sketch below for where these knobs live.
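For reference, the same values are exposed in World Settings > Lightmass. A minimal C++ sketch, assuming the FLightmassWorldInfoSettings member names below match your engine version (double-check them in EngineTypes.h), that tightens the grid for a small indoor level:

```cpp
#include "GameFramework/WorldSettings.h"
#include "Engine/World.h"

// Tighten the Volumetric Lightmap grid for a small interior map.
// Equivalent to editing World Settings > Lightmass in the editor.
void TightenVolumetricLightmapForIndoor(UWorld* World)
{
    if (AWorldSettings* Settings = World ? World->GetWorldSettings() : nullptr)
    {
        // Default is 200 units between top-level cells; too sparse for
        // small interiors, so light bleeds through walls via interpolation.
        Settings->LightmassSettings.VolumetricLightmapDetailCellSize = 50.0f;

        // Denser grids cost VRAM (VLMs are preloaded), so cap brick memory.
        Settings->LightmassSettings.VolumetricLightmapMaximumBrickMemoryMb = 30.0f;
    }
}
```

Halving the cell size roughly multiplies the sample count by eight (2x per axis in 3D), so lower it in steps and keep an eye on VRAM.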
Pitch-black probes may be the result of them being under the landscape (even if the landscape component is hidden or made invisible by the ‘visibility’ tool); there is a setting to disable this “optimization” in the project config. Also, if you use sublevels they may all be loaded into Lightmass at once, causing incorrect shadows from ‘nonexistent’ objects.
First off, what would anyone get out of lying about Blueprint performance, especially people who use Blueprints?
Second off, Blueprint performance issues were solved with Blueprint nativization. Nativization to C++ had its fair share of issues, but removing the feature entirely instead of fixing it was kind of sad; they didn’t remove subobjects and are instead fixing them, so why not do the same for Blueprint nativization?
https://youtu.be/S2olUc9zcB8
Third off, Epic themselves are aware of the Blueprint performance overhead; you likely won’t notice it in lightweight systems such as inventory.
The issue is that there is overhead per node and per transition from C++ into the Blueprint VM and back, which is why having multiple BP actors with an empty Tick node can cause lag: the cost of entering Blueprint VM land is paid every frame even though the tick does nothing.
Epic band-aided this in recent Unreal versions by checking whether the Blueprint’s Tick has any nodes and, if not, skipping the call into the Blueprint VM entirely, to prevent people from accidentally lagging their project with empty ticks.
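If you’re authoring the actor in C++ anyway, the standard way to sidestep that per-frame VM entry is to not register the tick at all, or to tick less often. A minimal sketch (the class name is hypothetical; the tick settings are stock AActor API):

```cpp
#include "GameFramework/Actor.h"
#include "MyActor.generated.h" // UHT-generated header, last include by convention

UCLASS()
class AMyActor : public AActor
{
    GENERATED_BODY()

public:
    AMyActor()
    {
        // Cheapest option: never register the tick function at all,
        // so there is no per-frame call to skip in the first place.
        PrimaryActorTick.bCanEverTick = false;

        // Or keep ticking but at a lower rate, e.g. 10 Hz:
        // PrimaryActorTick.bCanEverTick = true;
        // PrimaryActorTick.TickInterval = 0.1f;
    }
};
```

Event-driven logic (timers, delegates, overlap events) covers most cases where people reach for Tick, and none of those pay the per-frame VM entry cost.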