Say I have a UMG Widget with a couple of Borders exposed as variables, and I want to implement the "On Touch" events (Moved, Started, Ended). What's the workflow for checking whether a touch event is over one of my Borders? I don't really understand how the "My Geometry" and "In Touch Event" inputs work, or what the return value should be, when implementing the touch events. By digging through the context-sensitive nodes I've managed to get the XY coordinate of the event, and then I've done manual hitbox math (i.e. I placed my Border at a known location, I know its size, and I do <= and >= checks against the touch location) to get something working. I feel there must be a much better way to do this, but even getting the position of a widget in Blueprints seems to be impossible, even though setting all of its variables is easy.
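To make that concrete, here's roughly what my current approach boils down to, written as the C++ equivalent of my Blueprint graph (a sketch: `UMyTouchWidget`, `BorderPosition`, and `BorderSize` are placeholder names for my widget class and the variables I maintain by hand):

```cpp
#include "Blueprint/UserWidget.h"

// Inside a hypothetical UUserWidget subclass (UMyTouchWidget).
// BorderPosition/BorderSize mirror where I placed the Border in the
// Designer -- keeping them in sync by hand is the part that feels wrong.
FReply UMyTouchWidget::NativeOnTouchMoved(const FGeometry& InGeometry, const FPointerEvent& InTouchEvent)
{
	// "My Geometry" / "In Touch Event" in Blueprint correspond to these
	// two parameters. The touch position arrives in absolute (screen)
	// space, so convert it into this widget's local space first.
	const FVector2D LocalPos = InGeometry.AbsoluteToLocal(InTouchEvent.GetScreenSpacePosition());

	// The manual hitbox math: <= / >= checks against the Border's
	// hard-coded position and size.
	const bool bOverBorder =
		LocalPos.X >= BorderPosition.X && LocalPos.X <= BorderPosition.X + BorderSize.X &&
		LocalPos.Y >= BorderPosition.Y && LocalPos.Y <= BorderPosition.Y + BorderSize.Y;

	if (bOverBorder)
	{
		// ... react to the touch being over the Border ...
	}

	return FReply::Handled();
}
```

I suspect something like asking the Border itself (e.g. `GetCachedGeometry().IsUnderLocation(...)`) is the "proper" way to do this, but I haven't managed to wire that up in Blueprints.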
As an addendum: is there any way to get an event fired the literal moment the user touches a finger to the screen? The only times I see any information update (a touch registering, the touch/mouse position updating to the current location) are when tapping (which only updates at the moment the finger is released, not when the tap starts) or when moving a finger across the screen.
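For reference, the start-of-touch handler I'm implementing looks like the sketch below (`TouchStartTime` is a float member I made up to stamp when the finger goes down); my possibly-wrong understanding is that returning `FReply::Handled()` here is also what keeps the follow-up Moved/Ended events routed to the widget:

```cpp
// Sketch of my touch-start override. I expected this to fire the instant
// a finger goes down, but I only see activity on release or on drag.
FReply UMyTouchWidget::NativeOnTouchStarted(const FGeometry& InGeometry, const FPointerEvent& InTouchEvent)
{
	// Stamp the time the finger went down (TouchStartTime is a member
	// I added for hold detection).
	TouchStartTime = GetWorld()->GetTimeSeconds();

	// Returning Handled should claim the touch for this widget.
	return FReply::Handled();
}
```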
Without knowing the moment a finger touches the screen, it's impossible to implement any reliable "hold" interaction.
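In case it clarifies what I'm after, the hold logic would be roughly this (a sketch, assuming the start event fired reliably; `HoldThreshold` is a made-up constant):

```cpp
// Hold detection I'd like to build on top of a reliable touch-start:
// compare the release time against the stamped start time.
FReply UMyTouchWidget::NativeOnTouchEnded(const FGeometry& InGeometry, const FPointerEvent& InTouchEvent)
{
	const float HeldFor = GetWorld()->GetTimeSeconds() - TouchStartTime;

	if (HeldFor >= HoldThreshold) // e.g. 0.5 seconds
	{
		// ... treat as a hold ...
	}
	else
	{
		// ... treat as a tap ...
	}

	return FReply::Handled();
}
```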