UIX: My work toward a user-editable, Blueprint-compatible UI system

Over the past month we’ve been working on a new approach to UI in our game. We need to get our UI efforts underway now, and UE4 currently lacks an adequate system for doing 2D UI.

We’ve decided to build a new system we’re calling “UIX,” and we wanted to share this here to open a discussion with Epic and the community. Maybe this will be useful to some readers, and maybe this will help communicate to Epic what our team is really looking for in an integrated UI development system and help guide their efforts.

We’d like to share our requirements for the UI, our evaluation of the alternatives that led us to building UIX, and a description of how it works. This system wasn’t difficult to build, and this kind of functionality is something we feel should be part of UE4 in some form.


Our requirements are:

  • The UI system needs to be user-editable with a more-or-less WYSIWYG layout editing system, as well as allowing us to edit the layout in some sort of text-based format.
  • It needs to be fully integrated with Blueprint and allow us to do most of the work of setting up a UI in Blueprint and the layout editor, while also allowing us to do whatever we need to in C++.
  • It needs to give us full control to allow us to do fades, sound effects, and color/position animations trivially from Blueprint, and also be extensible enough to support more advanced UI animation effects (shimmers, glows) with a bit more work.

The alternatives we considered:

We looked at CoherentUI, and it seems good on a lot of levels and very feature-complete, but it requires you to build your UI in HTML/CSS/JavaScript. We’d far rather use Blueprint and don’t want to have to deal with bridging the divide between UE4’s Blueprint/C++ and CoherentUI’s HTML/CSS/JavaScript (also, JavaScript is horrible). We’re also concerned about performance and UI latency in the long run given the asynchronous way that CoherentUI is integrated with UE4.

We looked at Scaleform, but couldn’t find enough information on pricing or on the current state of their UE4 integration. We’re also concerned about how well it will integrate with UE4 going forward.

We also looked at Slate, and implemented the first pass of our UI system with it, but we concluded it wouldn’t work for us in the long run, because:

  • Slate currently has no interactive UI editing tools.
  • Slate isn’t compatible with Blueprint, which is annoying since that’s where we want to do most of our UI work. Slate forces you to do everything in C++, so every change is a recompile.
  • Slate forces us to write all sorts of little callbacks just to handle mouse and keyboard events or change a widget’s position at runtime, and this is often a clumsy, roundabout way to do things.

So Slate is out, and even if it gets nice interactive editing tools at some point, Epic hasn’t announced anything about them yet, so we can’t count on them right now.

Epic is also working on Unreal Motion Graphics, but we don’t yet have any specifics as to what this system will do or even if it will do anything like what our game will need.


How UIX works:

The UIX system currently supports:

  • Interactive in-game UI editing with the mouse and keyboard. You can mouse-over an element in-game to change its name, tweak its position with the mouse, change its sort order, or change the texture it uses. Yes, in-editor editing would be better, but we’re not willing to modify engine source files at this point.
  • Text files. All UIXFrames save out their layout data as “.UIX” files which are trivially editable by hand, as simple lists of elements with their names, sizes, positions, and texture indices.
  • Source control integration. UIX doesn’t yet automatically check files out for you the way I’d like, but it does respect the read-only status of the UIX files and won’t let you edit a frame unless the associated UIX file is checked out.
  • 2D rendering through Canvas. 3D UI elements aren’t supported yet.

Here are some screenshots of the UIX placeholder UI for our game so far:

http://imageshack.com/a/img845/3448/7hph.png

Here’s what the interactive editing mode looks like:

http://imageshack.com/a/img834/5483/gkdi.png

And here are two screenshots of the Blueprint graphs for two of these UI elements:

http://imageshack.com/a/img838/7486/5hh4.png

http://imageshack.com/a/img844/5919/v1p3.png

Finally, a .UIX layout file looks like this:

http://imageshack.com/a/img837/6205/ny3x.png
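For readers who can’t see the screenshot, here is a hypothetical sketch of what such a file could contain, based on the description above (element names, positions, sizes, and texture indices); every name and field in this sketch is invented, and the actual syntax is whatever the screenshot shows:

```
Frame=MainMenu
Element Name=Background  Pos=0,0      Size=1280,720  Texture=0  Sort=0
Element Name=PlayButton  Pos=540,300  Size=200,60    Texture=1  Sort=1
Element Name=QuitButton  Pos=540,380  Size=200,60    Texture=2  Sort=1
```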

The core functionality of UIX relies on a C++ base class, AUIXFrame. AUIXFrame holds an array of textures and a list of ‘elements’. An ‘element’ is like a widget: it draws a texture (chosen by an index into the texture array) inside a user-defined 2D rectangle. A frame is intended as a logical grouping of UI elements that are closely related and should show or hide together.

AUIXFrame exposes basic Blueprint events for responding to mouse enter/exit, mouse-down, mouse-up, abandoned clicks, etc., plus functionality for making elements and frames fade in/out, show/hide, animate their colors and positions, change their texture indices, and so on.
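As a concrete illustration, the Blueprint surface described above might be declared roughly like this in UE4 C++ (a hypothetical sketch, not the actual UIX source; all class, property, and function names here are invented):

```cpp
// Hypothetical sketch of AUIXFrame's Blueprint-facing interface.
UCLASS(Blueprintable)
class AUIXFrame : public AActor
{
    GENERATED_UCLASS_BODY()

public:
    // Textures that this frame's elements index into.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "UIX")
    TArray<UTexture2D*> Textures;

    // Events fired per element as the mouse interacts with it.
    UFUNCTION(BlueprintImplementableEvent, Category = "UIX")
    void OnElementMouseEnter(FName ElementName);

    UFUNCTION(BlueprintImplementableEvent, Category = "UIX")
    void OnElementMouseUp(FName ElementName);

    // Blueprint-callable helpers for fades and texture swaps.
    UFUNCTION(BlueprintCallable, Category = "UIX")
    void FadeFrameIn(float Duration);

    UFUNCTION(BlueprintCallable, Category = "UIX")
    void SetElementTextureIndex(FName ElementName, int32 NewIndex);
};
```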

To make your own frame, all you need to do is:

  • Derive a class from AUIXFrame in C++ and add whatever C++-specific functionality you need.
  • Create a new actor blueprint derived from that C++ class.
  • In that blueprint object, specify the textures that the UI elements in that frame will use.
  • Run the game, hit F9 to enter UIX editing mode, mouse-over the UI elements you want to modify, and stretch, move, rename, re-order, and re-texture them to your heart’s content. You can also add UIX elements to the frame and delete them by hitting Insert and Delete on the keyboard. Hitting F9 automatically saves your changes to text-based “*.UIX” layout files.
  • In your Blueprint, use the events and functions provided by AUIXFrame (or any custom ones you defined in your C++ subclass) to define the frame’s behavior.
  • Add an instance of that Blueprint actor, either by dropping it into the world or by spawning it dynamically.
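The first step above might look like this in C++ (a hedged sketch; the class and function names are invented):

```cpp
// Hypothetical subclass for step 1: derive from AUIXFrame and add any
// C++-specific functionality, exposed to the Blueprint derived from it.
UCLASS(Blueprintable)
class AMainMenuFrame : public AUIXFrame
{
    GENERATED_UCLASS_BODY()

public:
    // Example of game-specific logic that the derived Blueprint can call.
    UFUNCTION(BlueprintCallable, Category = "MainMenu")
    void StartNewGame();
};
```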

I’m tempted to post the source code here, but it requires extensive integration with the player controller and the HUD in order to draw and intercept mouse events properly. I’m willing to do it if there’s demand, but there’s a ton of integration work involved, and of course I can’t make any guarantees about support or reliability.

There are also some issues we’re currently working on, such as the fact that the *.UIX files don’t get packaged when you package the game. Epic is looking into adding support that would allow us to do this, but right now, it’s not possible without modifying the engine source.

We’re also working on improvements to UI scaling for different resolutions. Right now, we’re simply directly scaling based on the client view rect vs. the client view rect when the frame was edited, but in the future, we’ll either make their positions and sizes relative to other UI elements (a bit like Slate does) or allow custom UI setups for different resolutions and interpolate between them as needed.

I’m also hoping to add support for mouse-wheel zooming in edit mode in the future to allow you to fine-tune the look of the UI, along with some positioning guides for when you need to move or resize an element relative to some other existing element(s).

I caught your post in my Scaleform thread and I must say that I am impressed.

Since my last post there, I obtained a Scaleform for UE4 license, and I have to say it was less than intuitive for the average user to install and use. It seems to me, through discussions with them, that there is no plan to make that simpler. I suppose it is different if you have some engineers and coders on your team, but I am limited to one, and we cannot keep up with that kind of workflow as new engine versions release.

So I was in the market for other options to try. I tried Coherent and it is ‘neat’ but still not what I am looking for.

I did pull the throttle back a bit, though, because the release of UMG may solve my issues and be all that I really need. We are ahead of schedule on development, so I have a little time to burn, and I would love to check this out. I think it would be great to show this workflow to the UMG team so they can see what a Blueprint-driven UI in practice is like for you… It could lead to more diversity in the toolset!

Great! Thanks, Mike … I hope the UMG guys will read my post.

I will try to post the source code sometime tomorrow – I need to clean it up a bit first (and remove some project-specific stuff) and also write up some notes on integrating it.

I am not a coder, but I know there are posts out there on how to create a plugin. That would allow you to do the integration properly and maybe prevent a lot of user errors. Just a thought!

If we were looking at someday building this into its own robust, standalone UI system, we might do that. We’re game developers, though, so our game has to come first, and we can’t dedicate the resources to do that. I’m really just posting it here to A) give Epic an idea what I’m looking for, and B) give fellow devs a nudge in the right direction.

Dear Mothership,

I have good news for you and your team!

**You don’t have to modify the engine to use all your game and UI classes in the Editor!**

Even if you are only focused on your game, I can show you the steps to being able to edit your UI in-Editor:

  1. Extend the UnrealEdEngine class.

  2. Create a new EdMode that is triggered from your UnrealEdEngine class when you select certain actors or perform some other initiating procedure.

  3. In your EdMode’s .cpp, you can access all of your existing UI classes, because your EdMode will include YourGame.h.

  4. Set the config file for your game to use your custom EdEngine class.

**Result:** If you wish, you can now edit your UI in the Editor, using your easily constructed custom Editor class and EdMode!

The entire code sample for this is here, in my plugin source code.

Just replace all the PCH includes in the EdEngine and EdMode classes with YourGame.h, and you have your entire game available to you.

The **key point** I am making is that all you need in your game is a custom EdEngine and EdMode class, and your game can then use its entire class structure in the Editor.

No engine code needs to be modified and the code will work in every engine version, as long as it compiles within your game project properly.
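For reference, step 4 above (pointing the editor at your custom EdEngine class) is typically done in your project’s DefaultEngine.ini; assuming a game module named YourGame and a class UYourGameEdEngine (both placeholder names), it would look roughly like this:

```ini
; DefaultEngine.ini -- override the editor engine class with your own.
; "YourGame" and "YourGameEdEngine" are placeholder names.
[/Script/Engine.Engine]
UnrealEdEngine=/Script/YourGame.YourGameEdEngine
```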

**I did the exact process I describe here, so I know it works** :slight_smile:

As to your UI project itself, it sounds and looks spectacular!

I created my own in-game UI solution as well, so I can easily appreciate the awesomeness of what you’re working on and have already accomplished! :slight_smile:

Rama

Thanks, Rama! This is helpful.

I am not sure we will ever find the time to actually do this – editing in-game is fine for now. But it’s good to know that there’s a way to do it without modifying the engine.

Thanks! We used the Canvas drawing stuff you posted as a starting point, and just added the widget grouping (‘elements’ inside ‘frames’), WYSIWYG editing, and .UIX file loading/saving on top of that.

Hi there,
We announced a lot of information about UMG in our Twitch stream, which you can see here: Twitch

Short version is that it will support everything you want and will also include 3D support.

We chose to use Slate as the backing for UMG because we already have a huge library of common widgets (robust list/tree support, editable text blocks, a menu system, combo boxes, etc.) that would be really hard to replicate in Canvas without a lot of work, and we didn’t want to duplicate that effort on our side. Additionally, Slate supports efficient batching to reduce draw overhead, and Canvas is lacking in that department. We’d be happy to hear your feedback on our plans for UMG. We are pretty open to sharing all the details, so just ask.

Thanks, Matt! I will check that out tonight (we’re dealing with a very slow net connection at our office right now).

My main question for you is: what is the ETA on these various features? We’re going to need to iterate continuously on UI throughout our dev cycle, so I’m curious if you can give us a rough estimate for the different features (or at least a prioritization).

UMG is currently the highest priority for the tools team at Epic. It is going to take several months of work before it is ready, though. I’d probably have a better estimate at the end of the summer, but I don’t have an ETA because we don’t have one internally either. Of course, the code for UMG will be available on GitHub as we develop it. If you need a UI solution right now, then your work (which looks cool, btw) is probably the best option.

Definitely cool to see the community working on stuff like this. I suspect there will be many UI solutions available in our marketplace once it becomes open to user contributions. People will be able to pick which one works for them!

OK. Thanks, Matt! I will stick with what we have for now and decide at that point whether to switch to UMG, tweak my UIX system to work on top of UMG, or stick with UIX.

Looking forward to seeing it in action; we will definitely be one of your first testers for this.

Cool stuff Mothership. The work in progress version of UMG can always be run out of master with the -umg command line option. It enables the creation of the “Widget Blueprint” which is the key asset that the artist workflow revolves around.

Cheers,
Nick

Thanks, Nick! I will give that a try for sure.

-Paul Tozour

Any luck with this? I can’t seem to get results.

Yep – sort of.

Make sure you specify “-umg” as a command-line parameter. Then, in the Content Browser, go to New > User Interface > Widget Blueprint.

That will let you create a new WidgetBlueprint object. Double-click on it and you’ll get a specialized blueprint window that looks like this:

I had some luck dragging widget elements from the left side onto the area on the right, but I’m not quite sure how to use it yet, and as far as I can tell it definitely doesn’t seem ready for prime time.

Epic guys, any suggestions for messing around with this?