Recently I started playing with UMG. Seems like it has a lot of potential to make some pretty good-looking UI.
One thing I’ve run into though is getting the touch position within a widget. Basically I have an image that I want players to roll their fingers (or the mouse pointer) across, so that it 1) animates depending on the touch position and 2) provides gameplay code with information about where the player is touching.
Animating was easy, but getting either the touch location or the widget’s screen coords (so gameplay can work out where the player is touching and drive the animation) is proving difficult.
Two options:
How do I get a widget position in AHUD::DrawHUD(UCanvas*) coords?
How do I get the mouse pointer into widget coords (really I want a 0-1 result on each axis)?
I’m hoping the state of the art is further along than the big blueprint functions I’m seeing in my searches.
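To illustrate what I mean by 2), something along these lines is roughly what I’m after - a rough, untested C++ sketch on a hypothetical UUserWidget subclass (the names are mine):

```cpp
// Inside a UUserWidget subclass (UMyDialWidget here is just a made-up name).
FReply UMyDialWidget::NativeOnMouseMove(const FGeometry& InGeometry, const FPointerEvent& InMouseEvent)
{
	// Screen-space pointer position -> widget-local position
	const FVector2D LocalPos = InGeometry.AbsoluteToLocal(InMouseEvent.GetScreenSpacePosition());

	// Normalise by the widget's local size to get a 0-1 result on each axis
	const FVector2D LocalSize = InGeometry.GetLocalSize();
	const FVector2D Normalised(LocalPos.X / LocalSize.X, LocalPos.Y / LocalSize.Y);

	// ...hand Normalised to gameplay / drive the animation here...
	return FReply::Handled();
}
```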
Regarding 1) it’s not the mouse position I need - as you point out that is trivial. It’s the widget position in screen space, OR the mouse position in widget space.
Regarding 2), widgets may or may not move… right now I’m trying to work out if the coordinate mapping can be done easily.
My app UI is custom, and fairly simple. I have my own C++ concepts of widgets and my own image widgets, buttons, scrolling lists, etc. UMG however is significantly more robust and exposed to Blueprint, so I’m trying to learn the ropes.
In this case I have a car steering wheel and the user drags it around to interact. Would be trivial in my own system but I want to do this in UMG.
That led me in the direction of getting a result via UMG, though it seems more complicated than necessary - you have to convert positions and sizes using anchor points to get screen-space information.
I’d like a function - GetFinalLocation(FVector2D& outMin, FVector2D& outMax) - or something similar.
These functions might be close (as seen in this thread), but when I printed out their results I didn’t realise that UMG was treating my Y value as negative, measured from its anchor at the bottom of the screen.
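For what it’s worth, here’s a sketch of the helper I’m wishing for, in C++ - untested, and GetWidgetViewportRect is just my own name - built on GetCachedGeometry() and USlateBlueprintLibrary::LocalToViewport:

```cpp
#include "Blueprint/SlateBlueprintLibrary.h"
#include "Components/Widget.h"

// Hypothetical helper along the lines of the GetFinalLocation() I'd like:
// returns the widget's min/max corners in viewport pixel coordinates
// (the same pixel space AHUD::DrawHUD / UCanvas works in).
void GetWidgetViewportRect(UWidget* Widget, FVector2D& OutMin, FVector2D& OutMax)
{
	const FGeometry& Geometry = Widget->GetCachedGeometry();

	FVector2D ViewportMin, ViewportMax;
	USlateBlueprintLibrary::LocalToViewport(Widget, Geometry, FVector2D::ZeroVector, OutMin, ViewportMin);
	USlateBlueprintLibrary::LocalToViewport(Widget, Geometry, Geometry.GetLocalSize(), OutMax, ViewportMax);

	// OutMin/OutMax are the pixel-space outputs; ViewportMin/ViewportMax are the
	// DPI-scaled viewport coordinates if you need those instead.
}
```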
Afraid I changed my design to avoid the problem. There were some large Blueprint networks out there that people were saying might solve it, so have a search and you may find the answer… Sorry not to be much use!
In Tick and a few other events for UMG widgets, there is a Geometry parameter that comes with them.
Pull out that pin and you can create a Local to Viewport node.
This allows you to transform between UMG-space coordinates and touch coordinates. It should work for mouse coordinates too, since I think they are in the same coordinate space.
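For anyone doing this from C++, that node is backed by USlateBlueprintLibrary::LocalToViewport, so a rough (untested) sketch of the same conversion in NativeTick would look something like this - UMyJoystickWidget is just a stand-in for whatever your UUserWidget subclass is:

```cpp
#include "Blueprint/SlateBlueprintLibrary.h"

void UMyJoystickWidget::NativeTick(const FGeometry& MyGeometry, float InDeltaTime)
{
	Super::NativeTick(MyGeometry, InDeltaTime);

	// Convert the widget's local origin (its top-left corner) into viewport coords.
	FVector2D PixelPosition, ViewportPosition;
	USlateBlueprintLibrary::LocalToViewport(this, MyGeometry, FVector2D::ZeroVector, PixelPosition, ViewportPosition);

	// PixelPosition is in the same pixel space as touch/mouse positions reported
	// by the Player Controller; ViewportPosition is the DPI-scaled equivalent.
}
```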
I had to use this node to create my own touch joystick. I had to make it non-focusable with visibility set to Hit Test Invisible, then check touch info from the Player Controller to work out whether the touch is inside the joystick area or not, based on coords rather than the joystick widget itself (it’s a UMG button, you just can’t click it directly).
I have this custom UMG touch joystick working now, and am excited to try it out on mobile.
The main trick was that I did Local To Viewport on the UMG position coordinates and used the Pixel space output so I could compare them properly.
Testing whether the touch is inside the joystick bounds I decided on got tricky, because I had to take one position into the coord space of the other, find the difference, take that back into viewport coords, add the difference to the original, THEN compare the lengths of the 2D vectors. Whew!
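For reference, here’s roughly how that bounds check could look in C++ - a simplified, untested sketch with made-up names that skips the intermediate difference step by comparing everything directly in pixel space:

```cpp
bool UMyJoystickWidget::IsTouchInsideJoystick(APlayerController* PC, float RadiusPx) const
{
	// Joystick centre in local space -> viewport pixel space
	const FGeometry& Geometry = GetCachedGeometry();
	FVector2D CentrePx, CentreViewport;
	USlateBlueprintLibrary::LocalToViewport(PC, Geometry, Geometry.GetLocalSize() * 0.5f, CentrePx, CentreViewport);

	// Current touch position from the Player Controller (already in pixel space)
	float TouchX = 0.f, TouchY = 0.f;
	bool bPressed = false;
	PC->GetInputTouchState(ETouchIndex::Touch1, TouchX, TouchY, bPressed);
	if (!bPressed)
	{
		return false;
	}

	// Compare distances in the one shared (pixel) coordinate space
	return FVector2D::Distance(FVector2D(TouchX, TouchY), CentrePx) <= RadiusPx;
}
```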