I’m spawning a widget in world space and then animating the game camera to display it full screen.
I set my WidgetComponent draw size to match the size of the viewport, scale the component down to an appropriate size in world space, and place the camera at the precise distance/alignment so the widget fills the screen.
My widget was authored at 1920x1080 (i.e. DPI Scale = 1.0) and I’m currently running the game in a 1280x720 window (DPI Scale = 0.66).
If I just ‘Add to Viewport’, the widget scales as I would expect, but it does not when drawn as described above, despite the render target having the same resolution as the viewport. It appears that DPI scaling is not applied to WidgetComponent rendering, so I am unable to achieve the result I’m looking for. Is there any way to control this?
Edit: I’m able to work around this by using `UUserInterfaceSettings::GetDPIScaleBasedOnSize` to get the DPI scale for a given size. I use a BlueprintFunctionLibrary to wrap the C++ function and expose it to Blueprints (an example using the function is here. In case the link breaks in the future, the important part is this bit: `GetDefault<UUserInterfaceSettings>(UUserInterfaceSettings::StaticClass())->GetDPIScaleBasedOnSize(FIntPoint(X,Y))`).
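For anyone who wants a starting point, here's roughly what my wrapper looks like. The class and function names are my own (nothing official) and you'd need `Engine` in your module dependencies:

```cpp
// MyDPIFunctionLibrary.h -- hypothetical BlueprintFunctionLibrary wrapper
#include "Kismet/BlueprintFunctionLibrary.h"
#include "Engine/UserInterfaceSettings.h"
#include "MyDPIFunctionLibrary.generated.h"

UCLASS()
class UMyDPIFunctionLibrary : public UBlueprintFunctionLibrary
{
	GENERATED_BODY()

public:
	// Returns the DPI scale that the UMG scaling curve (Project Settings >
	// User Interface) would apply at the given resolution.
	UFUNCTION(BlueprintPure, Category = "UI")
	static float GetDPIScaleForSize(int32 X, int32 Y)
	{
		return GetDefault<UUserInterfaceSettings>(UUserInterfaceSettings::StaticClass())
			->GetDPIScaleBasedOnSize(FIntPoint(X, Y));
	}
};
```

Marking it `BlueprintPure` means it shows up as a value node with no exec pins, which is convenient for this kind of lookup.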
Then I create a base class like Widget_MenuBase, which contains nothing except a function SetDPI that accepts a float (the given DPI scale) as input. I derive all my widgets from this class and override SetDPI to make any DPI-related adjustments in the derived widget.
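I did this in Blueprint, but a C++ equivalent of the base class would look something like this (sketch only; the overridable event is what matters):

```cpp
// Widget_MenuBase.h -- hypothetical C++ version of the base widget class
#include "Blueprint/UserWidget.h"
#include "Widget_MenuBase.generated.h"

UCLASS(Abstract)
class UWidget_MenuBase : public UUserWidget
{
	GENERATED_BODY()

public:
	// Derived widgets (C++ or Blueprint) override this to apply any
	// DPI-dependent adjustments, e.g. scaling font sizes or padding.
	UFUNCTION(BlueprintNativeEvent, Category = "UI")
	void SetDPI(float DPIScale);

	// Default implementation does nothing; the base class is just a contract.
	virtual void SetDPI_Implementation(float DPIScale) {}
};
```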
Finally, in the blueprint which contains the WidgetComponent that I need to make DPI-aware, I grab the widget’s RenderTarget size using GetCurrentDrawSize, feed the size into my UUserInterfaceSettings::GetDPIScaleBasedOnSize wrapper, and use the returned DPI to call SetDPI.
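The hook-up in the owning actor looks roughly like this in C++ (I actually do it in Blueprint; `WidgetComp` and the class names follow the sketches above and are my own):

```cpp
// Inside the actor that owns the WidgetComponent. Assumes a
// UPROPERTY UWidgetComponent* WidgetComp whose widget derives
// from UWidget_MenuBase.
#include "Components/WidgetComponent.h"
#include "Engine/UserInterfaceSettings.h"

void AMyMenuActor::ApplyWidgetDPI()
{
	// The render target / draw size the component is currently using.
	const FVector2D DrawSize = WidgetComp->GetCurrentDrawSize();

	// Ask the project's UMG scaling curve what DPI scale that size maps to.
	const float DPIScale =
		GetDefault<UUserInterfaceSettings>(UUserInterfaceSettings::StaticClass())
			->GetDPIScaleBasedOnSize(FIntPoint(
				FMath::RoundToInt(DrawSize.X),
				FMath::RoundToInt(DrawSize.Y)));

	// Push the scale into the widget so it can adjust itself.
	if (UWidget_MenuBase* Menu =
			Cast<UWidget_MenuBase>(WidgetComp->GetUserWidgetObject()))
	{
		Menu->SetDPI(DPIScale);
	}
}
```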
It’s still annoying to have to manually multiply values by a DPI scale factor in the derived widget(s), so I’m hoping there’s something we’re overlooking that just makes it work the way we’re expecting.