Are there any limitations? Is there anything you can do in UMG or Slate that you can’t do with your tool? Except for scripting the editor, of course. And the other way around?
Can you briefly explain what the XAML files are translated into, so that we can judge the limitations etc. ourselves, and whether this would be useful for specific projects?
Hi, I’m another one of the Noesis developers. I’ll try to answer your questions.
Well, of course we’re biased and we think our solution is better. That said, with UMG and Slate being developed by Epic themselves, they are always going to be first-class citizens and more tightly integrated into the engine. Besides that, I think some of the strong points of Noesis, the ones that give us an advantage, are (just to name a few):
We have an almost complete implementation of WPF, and we are improving it continuously. This means a rich set of controls, animation, styling, data binding and a nice declarative format in the form of XAML.
On the subject of XAML: personally I find Slate’s declarative syntax quite ugly, plus you still need to recompile if you want to make changes. UMG lets you visually edit your UI, but it’s a binary format, which causes problems when there are merge conflicts. XAML is text based, which means easy merging, and you can still use visual tools like Microsoft’s Blend to design and test your UI before importing it into Unreal. Or you can easily iterate on your design, moving from Blend to Unreal and back.
GPU-accelerated vector graphics support allows resolution-independent, sharp UIs that scale effortlessly across a variety of devices and resolutions, usually saving space at the same time. Of course, we also support bitmap images if you want to use them.
As for XAML, it really is just a nicer syntax that removes what would otherwise be a lot of boilerplate code. But everything you can do in XAML you can also do in code. As a simple example, the following XAML:
<Grid>
    <TextBlock>Hello World</TextBlock>
</Grid>
is equivalent to the following C++ code:
// Create the root panel
Grid* grid = new Grid();
// Create the text element and add it as a child of the grid
TextBlock* textBlock = new TextBlock("Hello World");
grid->GetChildren()->Add(textBlock);
We care deeply about performance. However, doing a fair comparison is hard. It would involve creating the same interface using every method, which is time consuming, and may not even be possible given the different feature sets. I can tell you, though, that many HTML-based solutions use web browser engines, such as WebKit, that were not designed with the real-time needs of games in mind. I’ll try to allocate some time to gather performance numbers from some of our Unreal demos.
As was pointed out before, our product is free for indie developers. You can just grab the SDK from our website and give it a shot yourself.
I have one more question though, and I think this one is important if we are looking to use your kit in specific projects: how does the XAML code actually cause the game to render the UI?
Does it generate C++ code that mimics the XAML interface in Slate? If so, that would mean we could technically do anything we can do in Slate as well, which is good news.
Depending on the answer, I (and probably many other people) might or might not consider using it. It’s just a matter of whether this kit might bite us down the road later.
PS: It is possible to use both Slate/UMG and your kit at the same time, right? Are they even able to work together?
We don’t use Slate. We just render directly to the RHI device. This is the only way to render everything correctly; think about vector rendering, for example.
Right now, you can render from Slate/UMG into Noesis, but not the other way around. We are working on it.
So far my experience has been great. My UI isn’t working in UE4 yet, but support is excellent and XAML was easy to pick up again. There are really only two pain points: having to rewrite (custom) controls in C++, and the way you’re meant to keep your XAML project integrated in the Content folder.
Our middleware is designed to be integrated into game engines. As such, it declares some interfaces for engine integrators to implement, to handle things like rendering and input. The UE4 plugin is built on top of this interface, and provides an implementation that uses Unreal’s abstractions for rendering and input. It also offers some Unreal-specific functionality, like the ability to bind data and commands to Blueprint variables and functions.
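To make that concrete, here is a rough sketch of what such an integration interface could look like. The names are entirely made up for illustration; this is not the actual Noesis API, just the general shape of a middleware that asks the host engine to implement rendering and input callbacks:

// Hypothetical engine-integration interface (illustrative only; the real
// Noesis API differs). The middleware calls into it, and each engine
// provides its own implementation on top of its native facilities.
class IEngineIntegration
{
public:
    virtual ~IEngineIntegration() {}

    // Invoked by the middleware when a batch of UI geometry must be drawn
    virtual void RenderBatch(const void* vertices, int numVertices,
                             const void* indices, int numIndices) = 0;

    // Forwarded by the engine so the middleware can react to input
    virtual void MouseMove(float x, float y) = 0;
    virtual void KeyDown(int key) = 0;
};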
The main runtime class is called NoesisInstance, and instances of it are generated by our Blueprint-derived class, NoesisView. It derives from UserWidget, which means it can be used anywhere a UserWidget can. It is created using the same CreateWidget function, added to the viewport the same way, and it can even be used with the WidgetComponent. It’s also how we handle input, which means that you use the same functions for that as with UMG (SetInputModeUIOnly, etc.). The only caveat is that our NoesisViews don’t show up in the same place as regular UMG Blueprints, because the editor explicitly looks for Blueprints derived from WidgetBlueprint, and ours isn’t. I’m sure we could work around this if it’s a deal breaker, either by changing the base class of our NoesisView, or by creating another Widget-derived class that offers the same functionality.
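For illustration, displaying one from C++ looks just like displaying any UMG widget. Here is a minimal sketch using the standard UE4 UserWidget API; the helper function name and the assumption that ViewClass points at a NoesisView Blueprint are mine:

#include "Blueprint/UserWidget.h"
#include "GameFramework/PlayerController.h"

// Hypothetical helper: ViewClass is assumed to reference a NoesisView
// Blueprint, but any UserWidget subclass works the same way.
void ShowView(APlayerController* PC, TSubclassOf<UUserWidget> ViewClass)
{
    UUserWidget* View = CreateWidget<UUserWidget>(PC, ViewClass);
    if (View)
    {
        // Same call you would use for a UMG widget
        View->AddToViewport();

        // Route input to the UI, exactly as with UMG
        FInputModeUIOnly InputMode;
        InputMode.SetWidgetToFocus(View->TakeWidget());
        PC->SetInputMode(InputMode);
        PC->bShowMouseCursor = true;
    }
}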
As for rendering, we don’t create UMG widgets. In fact, as I said, from UMG’s point of view a whole NoesisInstance is a single UMG widget. We do render through the widget Paint interface, though: we have a CustomDrawer that is executed on the render thread, where we interact directly with Unreal’s RHI.
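For those curious about the mechanism, this is roughly the engine hook involved. A minimal sketch of UE4’s custom Slate element API, not our actual drawing code:

#include "Rendering/DrawElements.h"
#include "RHICommandList.h"

// Slate calls DrawRenderThread() on the render thread, where the drawer
// can submit commands directly to the RHI.
class FMyCustomDrawer : public ICustomSlateElement
{
public:
    virtual void DrawRenderThread(FRHICommandListImmediate& RHICmdList,
                                  const void* RenderTarget) override
    {
        // Bind the render target and issue draw calls against the RHI here.
    }
};

// In the widget's OnPaint, the drawer (held as a thread-safe TSharedPtr)
// is enqueued as a single draw element:
// FSlateDrawElement::MakeCustom(OutDrawElements, LayerId, CustomDrawer);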
You mean having to rewrite them from C#? I’m afraid that’s impossible to get around. Having to keep the XAMLs inside the Content folder is something we could potentially get around. It should be possible to do it now if no absolute URIs are used, but we could also add an option to set the root path in the plugin’s project settings. I decided against that option because I’ve used it in the past and it has its own issues: people forget to set it, and inevitably at some point someone wants different roots for different files, at which point the whole thing becomes a mess. I went with the current route because it felt natural given the way Unreal automatically imports assets in that folder and places the generated UASSET in the same folder. But I’m open to changing it if it’s too disruptive for people’s workflows.
How does packaging work, then? We have to install the SDK and the plugin, but what happens when we try to package our project? Will the user still have to install the SDK?