There is indeed no easy way to make a simple button with a touch interface. Like one that registers a press, a touch dragged outside the button area, a finger dragged in, etc.
It gets even worse with multiple fingers: if you drag one finger out and drag another in, nothing in UMG can register this.
For buttons we ended up polling all touch locations every frame and then calculating whether any of them is inside the button area.
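A minimal sketch of that polling approach (all names here are illustrative, not actual UE API; in the engine you would gather the touch states from the player controller each tick):

```cpp
#include <cassert>
#include <vector>

// Hypothetical stand-ins for the engine types, just for illustration.
struct FVector2D { float X, Y; };

struct FButtonRect {
    FVector2D Min, Max; // screen-space corners of the button
    bool Contains(const FVector2D& P) const {
        return P.X >= Min.X && P.X <= Max.X &&
               P.Y >= Min.Y && P.Y <= Max.Y;
    }
};

// The button counts as pressed as long as at least one finger is
// inside its rect -- this naturally handles one finger dragging out
// while another drags in, which UMG cannot track.
bool IsButtonPressed(const FButtonRect& Button,
                     const std::vector<FVector2D>& Touches) {
    for (const FVector2D& T : Touches)
        if (Button.Contains(T))
            return true;
    return false;
}
```

The point is that the pressed state is recomputed from all active touches every frame, instead of relying on per-widget enter/leave events.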
The only thing we are lacking is an easy way to store information about which area of the screen has which function. Doing that properly would require storing the touch map as a texture (so it is easy to see the zones in the editor, or to display an overlay for testing touch events) and then pulling the zone ID from the pixel color at the touch coordinates. But that can only be done in C++. So instead we just calculated the location (and touch zone ID) with vector math.
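For example, if the zones happen to fit a uniform grid, the ID can be derived from the touch coordinates directly instead of sampling a texture. A hedged sketch (the grid layout, resolution, and all names are assumptions for illustration):

```cpp
#include <cassert>

// Hypothetical stand-in for the engine vector type.
struct FVector2D { float X, Y; };

// Assumed layout: a 4x3 grid of touch zones over a 1920x1080 screen.
constexpr int   Cols  = 4;
constexpr int   Rows  = 3;
constexpr float ZoneW = 1920.f / Cols;
constexpr float ZoneH = 1080.f / Rows;

// Returns the zone ID for a screen position, or -1 when off-screen.
// This replaces the "read zone ID from pixel color" texture lookup
// with pure vector math.
int GetTouchZoneId(const FVector2D& P) {
    const int Col = static_cast<int>(P.X / ZoneW);
    const int Row = static_cast<int>(P.Y / ZoneH);
    if (P.X < 0.f || P.Y < 0.f || Col >= Cols || Row >= Rows)
        return -1;
    return Row * Cols + Col;
}
```

Irregular zones would need per-zone shapes (rects, circles, polygons) tested in order, but the idea is the same: position in, zone ID out.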
UMG is useless for a touch interface. It can only be used to display the HUD, and even then it sometimes blocks touch input if you forget to change the default properties of the widgets (their visibility, which makes them consume hit tests).