I need to see where on the screen the user is holding down with their finger and whether it is within certain regions of the screen. I thought about using widgets and just making them invisible, but that seems like a bad idea since it can affect other UI or click-through behavior later on. Is there a good way of defining regions that can either adapt to screen size, or some other approach entirely?
I'm new to working with touch controls, so I want to do it right. Here is a 30-second MS Paint visual of what I mean, in case it isn't clear.
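One way to get regions that adapt to screen size is to define them as fractions of the screen (0 to 1) rather than in pixels, then normalize the touch position before testing. This is just a standalone sketch of that idea; the `FractionalRegion` struct and its fields are hypothetical names, not engine types:

```cpp
#include <cassert>

// Hypothetical region expressed as fractions of the screen (0..1),
// so it adapts to any resolution or aspect ratio.
struct FractionalRegion {
    float MinX, MinY, MaxX, MaxY; // all in [0, 1]

    // Test a touch given in absolute pixels against the region,
    // scaling by the current screen size.
    bool Contains(float TouchX, float TouchY,
                  float ScreenW, float ScreenH) const {
        const float NX = TouchX / ScreenW; // normalize to 0..1
        const float NY = TouchY / ScreenH;
        return NX >= MinX && NX <= MaxX && NY >= MinY && NY <= MaxY;
    }
};
```

For example, `FractionalRegion{0.0f, 0.0f, 1.0f/3.0f, 1.0f}` would be the left third of the screen at any resolution, with no invisible widgets involved.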
The Tick event and a few others in UMG have a pin called Geometry, which lets you convert between coordinate systems. Touch coordinates are different from UMG coordinates, and I think screen coordinates are not necessarily the same as touch coordinates either!
So to overcome this problem I use that Geometry pin with the Local to Viewport node, which takes a Vector2D coordinate pair and the geometry wire, and outputs two options: pixel coordinates or viewport coordinates.
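As I understand it (not certain), the difference between those two outputs is the viewport's DPI scale: viewport coordinates are pixel coordinates divided by the scale UMG applies to the whole UI. Here is a standalone sketch of that relationship; `Vec2` and the 1.5 scale factor are just illustrative, not engine values:

```cpp
#include <cassert>
#include <cmath>

struct Vec2 { float X, Y; };

// Sketch: viewport coords are pixel coords divided by the viewport's
// DPI scale (my understanding of the Local to Viewport outputs).
Vec2 PixelToViewport(Vec2 Pixel, float ViewportScale) {
    return { Pixel.X / ViewportScale, Pixel.Y / ViewportScale };
}

// The inverse, for going back the other way.
Vec2 ViewportToPixel(Vec2 Viewport, float ViewportScale) {
    return { Viewport.X * ViewportScale, Viewport.Y * ViewportScale };
}
```

If that's right, a 1920x1080 pixel position at a 1.5 UI scale lands at 1280x720 in viewport space, which is why mixing the two spaces makes touches look offset.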
I think to solve my problem (though I'm not sure) I had to grab the touch's pixel coords and then feed those into a second Local to Viewport node to convert them into UMG coords. Sorry, I don't have the blueprint handy, but I was able to get a UMG button to follow my touch around the screen this way.
The Geometry wire is also the best way to get the size of the viewport (there's a node for that) if you're trying to match the touch up with it or with the relative positioning of UMG elements on the canvas. Everything else I tried miscalculated things like where the edges and the middle are.
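Once you have the size from the geometry, working out edges and middle is just division, so the regions stay correct on any viewport. A minimal sketch, assuming you already have the size in the same space as the position (the `HorizontalThird` helper is a hypothetical name):

```cpp
#include <cassert>
#include <string>

// Hypothetical helper: classify a position into left / middle / right
// thirds, given the width reported by the geometry. Because everything
// is relative to the measured size, it adapts to any viewport.
std::string HorizontalThird(float LocalX, float LocalWidth) {
    const float Third = LocalWidth / 3.0f;
    if (LocalX < Third)        return "left";
    if (LocalX < 2.0f * Third) return "middle";
    return "right";
}
```

The same pattern works vertically, or with any other fraction of the size, as long as the position and the size come from the same coordinate space.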