Controlling objects with click-drag on a tablet

Cool, I did not know that widgets only know this during OnTick and OnPaint.

However, how do I break the Geometry structure? I tried everything; it does not have the usual Break Struct node.
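
(If I understand the accessor approach correctly, something like the sketch below should work instead of a Break node. FGeometry is an opaque struct, so you read it through its accessor functions; the same functions also show up as Blueprint nodes such as "Get Local Size", "Absolute to Local", "Local to Absolute" and "Is Under Location" in the Slate Blueprint Library. The class name and the sample point here are hypothetical, and it is C++ only to show which accessors exist.)

```cpp
// Minimal sketch, assuming a hypothetical UUserWidget subclass called UMyTouchWidget.
#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "MyTouchWidget.generated.h"

UCLASS()
class UMyTouchWidget : public UUserWidget
{
    GENERATED_BODY()

protected:
    virtual void NativeTick(const FGeometry& MyGeometry, float InDeltaTime) override
    {
        Super::NativeTick(MyGeometry, InDeltaTime);

        // FGeometry has no Break Struct node; read it through accessors instead.
        const FVector2D LocalSize = MyGeometry.GetLocalSize();          // widget size in its own local space

        const FVector2D SomeAbsolutePoint(100.f, 200.f);                // hypothetical screen-space point (e.g. a touch)
        const FVector2D LocalPoint = MyGeometry.AbsoluteToLocal(SomeAbsolutePoint);     // screen space -> widget local space
        const FVector2D AbsoluteOrigin = MyGeometry.LocalToAbsolute(FVector2D::ZeroVector); // widget's top-left in screen space
        const bool bInsideWidget = MyGeometry.IsUnderLocation(SomeAbsolutePoint);       // simple containment test
    }
};
```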

The first (oldest) and simplest reason why we are building our own touch interface:
We could not build a simple fire button with multitouch support. Starting fire was not a problem; the big problem was detecting when a touch left the fire button area.
For example, when the user dragged a finger off the screen, the event to stop firing never happened. On top of that came the logic for testing each finger everywhere, when all we needed to know was whether anything was pressing that button (see the polling sketch below).
All of that together got unnecessarily complex for a simple arcade game, so we figured that with direct (raw) input from the device we could build a far less complex touch interface. We have horizontal and vertical drag gestures, we may need pinch-to-zoom, and that is all. Slate is nice for complicated games like RPGs, but for a simple arcade shooter it is mostly overkill.
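
For reference, the per-finger polling looks roughly like this. It is only a C++ sketch (we are Blueprint-only; the same call exists as the "Get Input Touch State" node on the Player Controller), and the fire-button rectangle and function name are made up for illustration.

```cpp
// Minimal sketch: instead of tracking which finger entered the button, each frame we
// poll every touch slot and ask whether *any* currently pressed finger is inside a
// hypothetical fire-button rectangle given in screen space.
#include "CoreMinimal.h"
#include "InputCoreTypes.h"
#include "GameFramework/PlayerController.h"

static bool IsFireButtonPressed(APlayerController* PC, const FVector2D& ButtonMin, const FVector2D& ButtonMax)
{
    if (!PC)
    {
        return false;
    }

    for (int32 Finger = ETouchIndex::Touch1; Finger <= ETouchIndex::Touch10; ++Finger)
    {
        float X = 0.f, Y = 0.f;
        bool bPressed = false;
        PC->GetInputTouchState(static_cast<ETouchIndex::Type>(Finger), X, Y, bPressed);

        // A finger that was lifted or dragged off the screen simply stops reporting
        // as pressed, so there is no "touch left the button" event to miss.
        if (bPressed &&
            X >= ButtonMin.X && X <= ButtonMax.X &&
            Y >= ButtonMin.Y && Y <= ButtonMax.Y)
        {
            return true;
        }
    }
    return false;
}
```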

We still use UMG for visual feedback, but it loves to crash. It looks like the most unstable part of Unreal.

Maybe the biggest culprit here is the lack of tutorials and documentation for UMG and the touch interface. For us it was faster to develop our own touch interface than to go blindly through all those undocumented functions. And it seems that nobody but you Epic guys can answer those UMG questions here, or it is too much explaining for most of those who know to bother.

There is one quite big limitation for us: we can only use Blueprints, because without an Apple machine we cannot compile C++ for iOS.

PS: Thanks for the pointers.