I know UMG is experimental, so no hard feelings here at all.
Let me describe the situation from my POV. Until recently it was very hard to process touch (and mouse) events, but I just discovered that some widgets have mouse-over event bindings (e.g. Border). To my huge surprise, widgets like Button do not have them, so there is a big inconsistency in which events each widget can process. Luckily I can place a Border behind all the other widgets as a background. But then some widgets inhibit events, i.e. nothing below them receives anything: with a Button, for example, I cannot track mouse-over in the area the Button covers.
Now my 2 suggestions:
- make every widget behave in the same way for touch and mouse events. Best would be if they all shared the same parent functionality for all possible touch and mouse events.
- allow us to choose whether a widget inhibits events or not, and do it in a parent/child way. Something like a list of possible event types where we can opt in or out of each one separately; I think collision for physics objects has something like this.
- dummy (empty) widgets that are never rendered in game but can be tinted (and visible only in the UMG designer), just for creating zones/areas for a touch interface. It would be great if these could use simple shapes for marking the areas, like triangles, circles, and n-gons.
um that was 3
And one more thing, not related to touch in UMG:
respect rendering order of UMG widgets when I try to select them in the designer. I.e., if I have a huge background widget and a small one on top of it with a higher render value, the designer should always pick the one with the higher value. I think it currently picks the first one (i.e. the oldest) in the created-widgets list.
P.S. Generally it's a good job on UMG.