Looking for Feedback on Our Implementation Plan for our UI

Sorry if this is the wrong place. I started creating this thread in Content Creation, but once I started talking about C++ and rendering I thought it might make more sense here.

We’ve got a beautiful UI concept we’re currently planning the implementation for using UMG. We have a few ideas, but we also wanted to ask the community in case there’s an approach out there we haven’t considered.

This is the UI.

The skewed areas are expected to be widgets placed in the world to give them that 3D look. The problem we run into, however, is that 3D widgets are affected by post-processing. The post-process pretty accurately recreates the blur and parallax effect seen in the UI concept, but the colours end up washed out.

My current plan is to modify the WidgetComponent and derive a new custom component from it that would more easily let us specify which material to use when rendering the widget onto the quad. The material we’d write would use the Custom Depth pass to draw the widgets’ pixels without applying any post-process to them.

A secondary idea is to render the UI using a separate camera onto a render target and translate any mouse interaction from the viewport into that render target.

We’re worried that we’re missing a more obvious solution though. We’re looking for any thoughts and/or suggestions that might put us on a different path.

4.13 will have some things to help with this - chiefly the Widget Interaction Component, if you go down the widget component path. With that avenue open to you, you could take mouse input and apply it to widgets in the world. The negative aspect is that right now they still render before AA, so you’d probably want to switch to FXAA rather than TXAA when on the main menu, as the ghosting effect it causes on UI generally isn’t desirable.

Another avenue involves the retainer widget. You wouldn’t render it in the world; instead you’d render the UI through a retainer and then apply a parabola distortion to get the desired bending effect. The negative bit here is that the retainer currently doesn’t expose a delegate to let you transform incoming mouse input into a new virtual space, so your parabola distortion wouldn’t hit-test correctly. The solution would be to subclass SRetainerWidget, update the GetBubblePathAndVirtualCursors and TranslateMouseCoordinateFor3DChild functions to be aware of a possible distortion, and make a new subclass of URetainerBox that creates your new SRetainerWidget instead. You could get the material code from the engine material used in the widget component; the same goes for the code needed to calculate the new mouse position, also found in the widget component.

Glow is tricky. You’d either have to artificially blur some portion of the 0-1 range in your retainer shader, since we don’t yet have an HDR buffer in the Slate renderer, or just fake it with an image overlaid on top of the text. This also holds true for the Widget Component - it’s essentially a retainer widget whose render target is rendered in the world. There’s another material (a standard world surface material) used for the widget component, and as of 4.13 that can be overridden. You’ll need to duplicate a portion of the existing widget component material when making an override - things like having a parameter for the SlateTexture that gets set on it - but other than that you’re free to do any standard surface shader work, keeping in mind your input is still just a render target of the widgets.

Interested in this area also, as UDK triggers allowed devs to build in-game replicated UI / Menus in seconds…
Time to defer to Nick, but UDK allowed for secondary PP volumes of lower / higher priority iirc, same in UE4?

Not sure what you’re talking about - trigger actors are/were just for detecting collision in an area and then triggering an event; that doesn’t have anything to do with making a UI. There is no support for DPGs (Depth Priority Groups) in UE4.

Sent a PM (IC widget + Multiplayer), but you probably get swamped :)
Background to the UDK Trigger / Interactive Component comparison:

UDK Triggers could be used as a UI after attaching meshes etc…
They also replicated without additional work which was super helpful.
Here’s my question from the UE4jam-and-a-Special-Announcement Twitch:

Attaching UDK triggers to players / making them popup / disappear on a timer was also common.

I was just referring to Global Post Process Volume Priority…
So using mini-PP vols around the UI to override the master PP etc.