[Twitch] Support Stream - Unreal Motion Graphics (UMG) UI - Oct. 21, 2014

    #31
    Could you explain what was meant by the following quote from the UE 4.5 Release notes?
    DPI Scaling

    Unreal Motion Graphics now supports automatic scaling for resolution-independent UI. We now provide default DPI scaling rules for all games. You can configure this in your project’s Rendering settings.
    Is this related to the following card (https://trello.com/c/D0SC1m1e) in the roadmap?
    Mac Editor Retina support
    This involves a lot of reworking assumptions in Slate.
    As far as I understand, there is no support for vectors/SVGs, so I don't quite see how the button graphic is resolution-independent.
    Since text comes from fonts containing vector glyphs, I can see how that could be resolution-independent. However, they could also just be scaling down the text textures there; it depends on how it's implemented behind the scenes.

    Edit: Found this nugget from Michael Noland; text does seem to be resolution-independent:
    Slate actually handles something like this with a runtime atlas for font characters, rasterizing characters as needed at each scale size used in a frame. This works pretty well in practice, though quickly zooming in and out in a Blueprint can pollute the cache temporarily.
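The behavior described there can be pictured as a cache keyed by (character, scale). Nothing below is Slate's actual code, just a minimal sketch of the idea, and of why zooming through many scales temporarily pollutes the cache with extra entries:

```cpp
#include <map>
#include <utility>

// Sketch of a runtime glyph atlas: each (character, scale) pair is
// rasterized once and reused afterwards. The Rasterize call is a
// stand-in; real Slate rasterizes via its font cache into a texture atlas.
class GlyphCache {
public:
    int GetGlyph(char32_t ch, float scale) {
        auto key = std::make_pair(ch, scale);
        auto it = atlas_.find(key);
        if (it == atlas_.end()) {
            ++rasterizations_;
            it = atlas_.emplace(key, Rasterize(ch, scale)).first;
        }
        return it->second;  // atlas slot index
    }
    int Rasterizations() const { return rasterizations_; }

private:
    // Stand-in rasterizer: just hands out the next atlas slot.
    int Rasterize(char32_t, float) { return int(atlas_.size()); }

    std::map<std::pair<char32_t, float>, int> atlas_;
    int rasterizations_ = 0;
};
```

Zooming smoothly in a Blueprint graph touches many distinct scale values, so each one adds a new atlas entry; that is the "cache pollution" mentioned above.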
    For those users (gabrielefx, RPotter, and anyone else) interested in vector support, please post a reply in the Unreal Engine Vector Graphics Support thread so we can show Epic how many people are interested in this.
    Last edited by supermario6532; 10-21-2014, 12:59 PM.

    Comment


      #32
      I have a lot of User Widgets in my project. What about folders for User Widgets?

      Comment


        #33
        Very excited about this stream! I have a few questions:

        1. Are there plans to make it possible to build a settings menu that lets players customize controls in a blueprint-only project?
        2. I'm having trouble understanding the 1:1 correlation between the new Set Input Mode functions and the old Add to Viewport options, and there doesn't seem to be any documentation or learning resources on them yet. When I use one of these functions on a widget, my mouse cursor is invisible, and I'm stumped as to why. Can you explain how the new Set Input Mode functions are meant to be used?
        3. Are there plans to add a UMG example to the Content Examples project?
        4. Is there a UMG tutorial series coming to the YouTube channel any time soon?
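For anyone else hitting the invisible-cursor issue in question 2: as far as I can tell, the input-mode structs only control where input is routed, while cursor visibility is a separate flag on the player controller. An engine-dependent sketch (won't compile outside a UE4 project; `MyWidget` stands in for your UUserWidget instance):

```cpp
// Engine-dependent sketch: SetInputMode routes input, but the cursor
// visibility flag must be set separately or the cursor stays hidden.
APlayerController* PC = GetWorld()->GetFirstPlayerController();

FInputModeUIOnly Mode;                         // or FInputModeGameAndUI
Mode.SetWidgetToFocus(MyWidget->TakeWidget());
PC->SetInputMode(Mode);
PC->bShowMouseCursor = true;                   // cursor is invisible without this
```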
        Join Unreal Slackers, the largest Unreal Engine community on Discord! → https://unrealslackers.org
        Have a question? Give me a shout on Twitter → https://twitter.com/heypfist

        Comment


          #34
          Originally posted by es View Post
          I have a lot of User Widgets in my project, what about folders for User Widgets?
          You'll be able to categorize them in 4.6

          Comment


            #35
            My questions about UMG:
            - What is the best way to interact with UMG from C++ code?
            - Will you support input actions and axes in UMG?
            - Do you plan to add a real data-binding system for objects' member variables?
            - Will you also add the possibility to directly define a UMG HUD class for a level?
            - Will you update example projects like Shooter to implement the UI with UMG instead of Slate?

            Comment


              #36
              Thank you for answering my question about SVG support, Nick.

              I agree with everything you said about tessellation based vector graphic support. Supporting things like thin lines with a geometry based solution would be very hard without high levels of MSAA which isn't even possible with the current deferred renderer.

              From a performance standpoint, I think the best solution would be adding vector graphic support via CPU rasterisation.
              Certain image assets could be stored as SVG; then, based on the user's current display resolution, the SVG would be parsed and rasterized on the CPU to the target resolution at run time using the open-source, BSD-licensed AGG library (http://www.antigrain.com/) and uploaded to the GPU as a texture.

              Comment


                #37
                I wasn't able to watch the stream live, but I'd just like to throw a thank you to Nick Darnell and Matt Kuhlenschmidt
                for answering the questions in a great and detailed way.

                You guys are great.
                Cheers!

                Comment


                  #38
                  Where can we see the recorded stream?

                  Comment


                    #39
                    Originally posted by pinosh View Post
                    Where can we see the recorded stream?
                    http://www.twitch.tv/unrealengine/b/580423201
                    Map Generator 1.0
                    Map Generator 2.0
                    Map Generator 3.0

                    Comment


                      #40
                      Thanks zeustiak.

                      Comment


                        #41
                        Here's the tutorial I said I would do a quick write-up on during the stream, about making widgets appear over actors in the world (but in screen space).

                        https://forums.unrealengine.com/show...space&p=167853

                        Comment


                          #42
                          @NickDarnell Thanks for the stream; it's good to see these things happening on a semi-regular basis. You mentioned on the stream that you'd elaborate on some of the longer questions in the thread (including mine) after the stream, so I was wondering if you were still planning to do that.

                          Comment


                            #43
                            Originally posted by RPotter
                            Why is binding on the widget class itself? It would be nice to be able to offer an alternate data source, in a more MVVM-style pattern. Furthermore, it would be nice to limit the actual evaluation of the data to only when it changes (or an event says to refresh its local value), as the evaluation might be expensive. You can certainly approximate this by adding a gate and resetting it when changes occur, but the main reason I want this is to reduce the amount of logic a UI artist has to implement themselves, as well as keep as much logic out of the widget BP (that doesn't directly control what the widget is doing) as possible. Especially since these are binary assets, and right now a UI artist and engineer can't effectively work on the same widget at the same time (not without a lot of pain, and a lot of boilerplate hookup: create a variable that is the data source, and create a function for *every* data item that simply gets the data).
                            The binding is on the widget itself because it’s easy to understand. Where it goes from there is up to you. Several people have chosen to define the logic inside a UUserWidget subclass in C++, then simply bind the logic to themselves after reparenting the widget, which keeps the logic in C++ and away from the UI designer. Others have chosen to pass a Model object into the widget that they sample in their bindings. Both of those routes require a C++ developer because Blueprints do not permit constructing arbitrary UObjects yet, but that’s likely to change soon.

                            Future versions will probably allow bindings to be specified on a member object’s members to reduce having to make the extra step you mentioned.

                            Evaluating only when data changes is a difficult thing to determine automatically. There’s the KnockoutJS approach, which simulates running all bindings to discover every value they access, building the tree of possible values that could change and invalidate the UI. While that method is super slick, we don’t have any system in UE4 for doing anything like that. It’s also not how Slate works. Slate has TAttributes for generally any value on a widget, allowing you to bind it to a function that massages the data for the UI and always provides the latest value, or to simply pass a literal value to avoid calling a delegate. The whole UE4 editor is written with that approach.

                            If you’d rather not have bindings run every frame, you can use the Set______ functions instead to assign the value when you’ve detected it has changed. We may come up with something slicker in the future that allows more customization of how the bindings work. If a binding is expensive to evaluate, you could also build the cache into the View Model that you're binding to the UI.
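The "assign only when changed" pattern from the last paragraph can be sketched as a tiny observable property on a view model (names are illustrative, not UMG API):

```cpp
#include <functional>
#include <utility>
#include <vector>

// A view-model property that caches its value and notifies listeners
// only when the value actually changes, instead of the UI polling a
// binding every frame.
template <typename T>
class ObservedProperty {
public:
    void Subscribe(std::function<void(const T&)> fn) {
        listeners_.push_back(std::move(fn));
    }
    // Equivalent to calling a widget's Set______ function when the
    // underlying data is detected to have changed.
    void Set(const T& value) {
        if (value == value_) return;  // skip redundant UI updates
        value_ = value;
        for (auto& fn : listeners_) fn(value_);
    }
    const T& Get() const { return value_; }

private:
    T value_{};
    std::vector<std::function<void(const T&)>> listeners_;
};
```

A widget subscribes once and its Set function fires only on real changes, so an expensive evaluation happens at most once per data change rather than once per frame.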

                            Originally posted by RPotter
                            I'm also a fan of defining resources (dictionaries of values) that can be bound to directly, allowing easier style switching. Styles right now seem specific to a widget hierarchy, which makes them not nearly as useful. Why aren't the styles more modular? Why can't you bind elements to shared parameters?
                            Time. The plan is to build a modular system, in the mean time we’ll take steps to make things more reusable, like the named slot feature I mentioned in the stream.

                            Originally posted by RPotter
                            Less important, but still useful is the Visual State Manager, as seen in WPF. Being able to define states on a control (in a loose fashion), override those states, or add new ones on existing controls, and have the system auto generate transition animations is a *great* productivity booster. You utilize similar state machines in other areas of the engine, would it be possible to utilize that for this?
                            Maybe, but not any time soon.

                            Originally posted by RPotter
                            I'd also like to see vector support come in at some point. Although you can make items that *resize* using things like N-slices, they don't work for changing resolutions (the regions end up scaled down). Basically, every resolution you support needs its own asset, increasing the package size. It would be nice to have proper resolution-independent UI using vectors: if rasterizing them is considered a performance problem, why not just cache the results? It takes up the same memory the texture itself would need, and you can apply the cache to static sub-hierarchies (reducing total draw time overall). Selective retention is a great way to balance performance and visual quality.
                            Just caching the result only works well for static UIs. I think it will be a lot of work to find a nice balance of caching vs. live rendering vs. lerping between generated mips during animations. I also think you’d encounter a lot of problems finding a good way to not cache too much: with large full-screen backgrounds on UIs, you wouldn’t want to cache the entire surface area; you’d actually want to N-Slice the sections of the vector graphics so that you don’t have to cache large sections of just a flat color.
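For reference, the N-Slice idea mentioned above, in its simplest nine-slice form (a sketch, not UMG's implementation): split a rectangle by fixed margins so the corners keep their size, the edges stretch along one axis, and only the center stretches along both:

```cpp
#include <array>

struct Rect { int x, y, w, h; };

// Minimal nine-slice split: given an outer rectangle and fixed margins,
// produce the nine sub-rectangles (row-major: corners, edges, center).
// A flat-colored center region then never needs a large cached surface.
std::array<Rect, 9> NineSlice(const Rect& r, int left, int top,
                              int right, int bottom) {
    const int xs[4] = {r.x, r.x + left, r.x + r.w - right, r.x + r.w};
    const int ys[4] = {r.y, r.y + top, r.y + r.h - bottom, r.y + r.h};
    std::array<Rect, 9> out{};
    for (int row = 0; row < 3; ++row)
        for (int col = 0; col < 3; ++col)
            out[row * 3 + col] = Rect{xs[col], ys[row],
                                      xs[col + 1] - xs[col],
                                      ys[row + 1] - ys[row]};
    return out;
}
```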

                            Originally posted by RPotter
                            I know you guys have WPF experience over there, so I assume you must have at least considered all of these features at some point. As such, I am curious as to why you opted against them.
                            Some of it’s philosophy differences, but a lot of it is time and resource constraints.

                            Comment


                              #44
                              Originally posted by NickDarnell View Post
                              Both of those routes require a C++ developer because Blueprints do not permit constructing arbitrary UObjects yet, but that’s likely to change soon.
                              I'm usually working with C++ and it's not an issue for me, but constructing UObjects from Blueprints would be very useful. Can we have that ASAP?

                              Comment


                                #45
                                I just finished watching the archived stream, and there is one subject I'd like to expand on that was briefly mentioned. Matt mentions that one eventual performance improvement would be to automatically atlas textures. What about premade atlas textures? Most of our HUD elements are already atlased and delimited through FCanvasIcons defined in our AHUD actor, but I couldn't find a way to recreate the equivalent in UMG using either an asset type or a widget.
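For context, the FCanvasIcon-style setup being described boils down to a pixel sub-rectangle inside an atlas texture, converted to the normalized UVs a quad or brush would sample; a minimal sketch (field names are illustrative, not an engine API):

```cpp
// A premade-atlas icon: a pixel-space sub-rectangle inside a larger
// texture, similar in spirit to FCanvasIcon.
struct AtlasIcon {
    int u, v, w, h;  // pixel rect within the atlas
};

struct UVRect { float u0, v0, u1, v1; };

// Convert the pixel rect to normalized [0,1] texture coordinates.
UVRect ToUVs(const AtlasIcon& icon, int atlasW, int atlasH) {
    return UVRect{
        float(icon.u) / atlasW,
        float(icon.v) / atlasH,
        float(icon.u + icon.w) / atlasW,
        float(icon.v + icon.h) / atlasH};
}
```

Automatic atlasing would generate these sub-rects behind the scenes; the question here is about exposing the same mapping for atlases that are authored by hand.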
                                Last edited by cmartel; 10-27-2014, 02:25 PM.

                                Comment
