How to Convert UMG Widgets from Local to Absolute Position - Touch Input

Hi Guys,

This appears to have been asked a few times before, but the hit-or-miss responses have unfortunately left behind less-than-ideal documentation.

Is there any straightforward method to take the local position of a UMG widget (Button, Image, etc.), defined in the Designer, and convert it to the absolute position on the screen where it renders?

@Nawrot, it appears you have pursued this in the past, so I am wondering if you could shed some light on this subject.

I’m currently working on a mobile game with various gesture controls, and I am not a UI designer/programmer by any means (the team is stretched a little thin, so I offered to help out on the UI portion). The controls and gestures themselves are working; the problem is getting them to trigger when a touch event overlaps one of the image UI elements we created.

To my chagrin, it appears that the InputTouch event, which conveniently comes with Start, Move, and Release components, is overruled by any visible UMG elements: buttons, images, etc. If these elements are visible, the Touch Input will not activate, which throws a wrench into our gesture commands.

With that in mind, I have started down the path of manually checking whether the OnTouch, OnMove, or OnRelease events overlap any of these UMG areas, but I am losing the battle of converting the relative/local position of these items into the same coordinate space as the InputTouch event.

I’m posting some of the threads I have referenced so far. Any and all help would be greatly appreciated!

Edited: Added additional reference.

Further testing: I set the anchor point to the top left of the viewport, zeroed out the button’s position, and then tried:

Button → Get Paint Space Geometry → Get Local Top Left → Local to Viewport (Pixel Position or Viewport Position) returns unexpected coordinates. I expected 0,0, since everything was anchored and positioned to the top left.

Pixel Position: 8.002, 8.008
Viewport Position: 16.761, 16.761
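
In C++ terms, the chain I was testing looks roughly like this (a sketch; the wrapper function is mine, only the USlateBlueprintLibrary calls are engine API):

```cpp
// Rough C++ equivalent of the Blueprint chain above (sketch, not the
// original repro). Only the engine calls are real API; the wrapper is mine.
#include "Components/Widget.h"
#include "Blueprint/SlateBlueprintLibrary.h"

void DumpWidgetViewportPosition(UWidget* Widget)
{
    const FGeometry& Geometry = Widget->GetPaintSpaceGeometry();

    // Top-left corner of the widget in its own local space (normally 0,0).
    const FVector2D LocalTopLeft = USlateBlueprintLibrary::GetLocalTopLeft(Geometry);

    FVector2D PixelPosition;    // physical pixels, after DPI scaling
    FVector2D ViewportPosition; // viewport units, before DPI scaling
    USlateBlueprintLibrary::LocalToViewport(Widget, Geometry, LocalTopLeft,
                                            PixelPosition, ViewportPosition);

    UE_LOG(LogTemp, Log, TEXT("Pixel: %s  Viewport: %s"),
           *PixelPosition.ToString(), *ViewportPosition.ToString());
}
```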

Further tests using Paint Space Geometry, Cached Geometry, or the Canvas Slot’s Get Position each yielded unexpected results :\

That was one of the things I could not do with UMG.

I was making an arcade game, and at some point I decided to use UMG only to display visuals and to handle all of the touch input myself. The code was not complicated: just 2 sliders (turn and accelerate) and 2 buttons to fire.

The fire button was where I finally snapped and decided not to route touch input through UMG. I could not solve a few issues, like the button continuing to fire when the player slides a finger outside its area, plus a few more that I have forgotten.

I rescaled both the touch area and the screen resolution to the same fixed size (1920x1080). Then I wrote some simple code that checked whether the touch was inside each box’s area.
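
Roughly, the idea in code (a sketch with made-up helper names, not my original code):

```cpp
// Sketch of the fixed-resolution approach described above. The virtual
// resolution matches the post; the helper names and box layout are mine.
#include "Math/Box2D.h"
#include "Math/Vector2D.h"

static const FVector2D VirtualResolution(1920.f, 1080.f);

// Scale a raw touch location into the fixed 1920x1080 space, so the same
// authored boxes work at any device resolution.
FVector2D ToVirtualSpace(const FVector2D& Touch, const FVector2D& ViewportSize)
{
    return FVector2D(Touch.X * VirtualResolution.X / ViewportSize.X,
                     Touch.Y * VirtualResolution.Y / ViewportSize.Y);
}

// One axis-aligned box per control (slider/button), authored in virtual space.
bool IsTouchInBox(const FVector2D& VirtualTouch, const FBox2D& ControlBox)
{
    return ControlBox.IsInside(VirtualTouch);
}
```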

Yes, for a game engine it is totally silly not to have easy-to-use, game-friendly touch controls.

I also tried to color-code areas on a texture and read the pixel color instead of calculating all those boxes, but reading texture pixels was a bit complicated.

P.S.
Thinking about it now (I know more about Blueprints these days), I would make a general Blueprintable component that detects whether a touch is inside its given box (or circle), calculates the touch’s local position (relative to itself), and then calls an event dispatcher on its owning player controller; see the sketch below.
Why Blueprintable components? You can just drop them in (one per area to track) without modifying code, so you can redesign the GUI, the number of buttons, etc. without coding.
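
Something like this sketch (class, property, and event names are made up; treat it as a starting point, not a finished system):

```cpp
// Sketch of the component idea above. One component per touch region;
// drop as many onto an actor as you need.
#include "Components/ActorComponent.h"
#include "TouchRegionComponent.generated.h"

DECLARE_DYNAMIC_MULTICAST_DELEGATE_OneParam(FOnRegionTouched, FVector2D, LocalPosition);

UCLASS(Blueprintable, meta = (BlueprintSpawnableComponent))
class UTouchRegionComponent : public UActorComponent
{
    GENERATED_BODY()

public:
    // Region rectangle, authored in the same screen space as the touch input.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Touch")
    FVector2D RegionMin = FVector2D(0.f, 0.f);

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Touch")
    FVector2D RegionMax = FVector2D(100.f, 100.f);

    // The owning player controller binds to this instead of polling boxes.
    UPROPERTY(BlueprintAssignable, Category = "Touch")
    FOnRegionTouched OnRegionTouched;

    // Call from the player controller's InputTouch handler; returns true
    // and broadcasts the region-local position when the touch is inside.
    UFUNCTION(BlueprintCallable, Category = "Touch")
    bool HandleTouch(FVector2D ScreenPos)
    {
        const bool bInside =
            ScreenPos.X >= RegionMin.X && ScreenPos.X <= RegionMax.X &&
            ScreenPos.Y >= RegionMin.Y && ScreenPos.Y <= RegionMax.Y;
        if (!bInside)
        {
            return false;
        }
        OnRegionTouched.Broadcast(ScreenPos - RegionMin);
        return true;
    }
};
```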

Yes, I know all of that is a workaround.

P.S. I see you have the same problems I had with touch input over UMG.

P.P.S. For touch over UMG widgets, there is that Visibility property, and there was something else that changes the mode of how it reacts to input, but I forget its name.
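
It might be the Visibility setting itself, which doubles as the hit-test mode; a sketch of what I mean (assuming that is the right property):

```cpp
// The "mode" is probably ESlateVisibility, which controls hit testing as
// well as rendering (my guess at what is being half-remembered here).
#include "Components/Widget.h"

void MakeWidgetTouchTransparent(UWidget* Widget)
{
    // Still drawn, but touches/clicks pass straight through to whatever is
    // underneath (including the PlayerController's InputTouch event).
    Widget->SetVisibility(ESlateVisibility::HitTestInvisible);

    // SelfHitTestInvisible is the same, except children can still be hit.
}
```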

This is pretty much the route I have gone down.

Everything in isolation works as expected. It has just been crazy trying to figure out when and where a widget’s relative position gets translated to an absolute pixel position to then be drawn.

The math to determine the interaction is already set up and ready to go; however, I would rather determine that viewport pixel location via wherever the engine decides to draw these images. This would allow our artists to go in and do the layout via the UMG and avoid further headaches down the road. It’s this last bit that has been kicking my rear.

Hmmmm. From reading your P.P.S., I think maybe going the fake-2D route is going to be the way to go? I was hoping to keep it in UMG since it would be “glued” to the screen, but I guess I can just keep a plane or circle perpendicular to the camera and control its visibility as desired.

Anything past that is a simple trace, and it would already exist in the same coordinate system as the built-in InputTouch.
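
The trace half would be basically one call (a sketch; the visibility channel is just a placeholder choice):

```cpp
// Sketch of the "simple trace" half of the fake-2D idea: test what a touch
// hits in the world. The trace channel here is an assumption.
#include "GameFramework/PlayerController.h"

bool DidTouchHitControlPlane(APlayerController* PC, ETouchIndex::Type Finger,
                             FHitResult& OutHit)
{
    // Traces from the touch location into the scene; the touch is already
    // in the same coordinate system as the built-in InputTouch event.
    return PC->GetHitResultUnderFinger(Finger, ECC_Visibility,
                                       /*bTraceComplex=*/false, OutHit);
}
```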

I think you simply have to handle touch as you would any joypad/gamepad/HOTAS input.

You override the object’s default behavior and choose what to do yourself.
Then you either consume the input or pass it on for other things to consume (axis movement, for instance).

OnInputTouchBegin is probably where you want to start your override.
I would do this on a custom Image widget, then use that widget with a modifiable image so all of the items instantly have the same overridden functionality.
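
For a widget, the equivalent entry point is the touch override on the widget itself rather than the actor-side OnInputTouchBegin; a sketch of what I mean (class and names are illustrative):

```cpp
// Sketch: a custom image widget that refuses to swallow touch input.
// NativeOnTouchStarted is the widget-side override point.
#include "Blueprint/UserWidget.h"
#include "PassThroughImageWidget.generated.h"

UCLASS()
class UPassThroughImageWidget : public UUserWidget
{
    GENERATED_BODY()

protected:
    virtual FReply NativeOnTouchStarted(const FGeometry& InGeometry,
                                        const FPointerEvent& InTouchEvent) override
    {
        // Do any widget-local reaction here, then return Unhandled so the
        // touch continues on toward the PlayerController's InputTouch event.
        return FReply::Unhandled();
    }
};
```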

As far as converting measurements goes, you can always do the math yourself if you know a few things:

  1. Screen size.
  2. Percentage position of the widget item.
  3. Size of the widget (in px or percentage).

Then you can do first-grade math via equivalence, knowing that 100% matches the current width; see the sketch below.
(I really don’t see how any of that actually helps, but it is what you asked?)
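
The same math in code form, as a sketch (the example percentages would come from your own layout):

```cpp
// The proportional math described above. Percentages are fractions of the
// current screen size; example values are made up.
#include "Math/Vector2D.h"

bool IsTouchOverWidget(const FVector2D& TouchPx,
                       const FVector2D& ScreenSizePx,
                       const FVector2D& WidgetPercentPos,   // e.g. (0.25, 0.80)
                       const FVector2D& WidgetPercentSize)  // e.g. (0.10, 0.08)
{
    const FVector2D TopLeft = WidgetPercentPos * ScreenSizePx;  // % -> pixels
    const FVector2D Size    = WidgetPercentSize * ScreenSizePx;
    return TouchPx.X >= TopLeft.X && TouchPx.X <= TopLeft.X + Size.X
        && TouchPx.Y >= TopLeft.Y && TouchPx.Y <= TopLeft.Y + Size.Y;
}
```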

I have not done touch input in ages, so I am not sure if this is possible:

Create your own custom widgets, and build the UMG with only two layers:

  • background layer: some very basic container, border, etc.
  • second layer, where all the custom widgets are placed

Each custom widget knows its anchors (you probably need to set them up from the parent) and its own size; you can also give it the screen resolution, so the widget can determine its own coordinates on screen. This is doable if you do not have multiple nested widgets with different alignments, etc. I think that is also the reason there is no built-in function to get screen coordinates.

For recalculating between on-screen position and touch input, there is also the aspect ratio; sometimes it differed between the screen/widget resolution and the touch input resolution.
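
One concrete thing to check is the DPI scale UMG applies on top of the viewport; a quick sketch for logging both (assuming the standard layout library):

```cpp
// Quick sketch for inspecting the values that commonly cause these
// mismatches: viewport size vs. the UMG DPI scale applied on top of it.
#include "Blueprint/WidgetLayoutLibrary.h"

void LogViewportMetrics(UObject* WorldContext)
{
    const FVector2D ViewportSize = UWidgetLayoutLibrary::GetViewportSize(WorldContext);
    const float DpiScale = UWidgetLayoutLibrary::GetViewportScale(WorldContext);

    // UMG layout numbers are in DPI-scaled units, so you typically multiply
    // or divide by DpiScale when converting to and from viewport pixels.
    UE_LOG(LogTemp, Log, TEXT("Viewport: %s  DPI scale: %f"),
           *ViewportSize.ToString(), DpiScale);
}
```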

Ya, the problem we have been running into is not the math you outlined, but rather that we cannot get the data we need to do the math correctly.

We are not using any of the built-in event functionality for either UMG Images or Buttons, as these appear to intercept touch events before the actual InputTouch event built into the PlayerController. InputTouch offers multitouch and tracking data out of the box, and we already have that whole system working.
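
For context, our InputTouch setup is essentially the standard PlayerController binding, along these lines (a sketch; the class name is a placeholder):

```cpp
// Sketch of the PlayerController-side InputTouch setup referenced above.
// "AMyPlayerController" is a placeholder class name.
#include "GameFramework/PlayerController.h"
#include "MyPlayerController.generated.h"

UCLASS()
class AMyPlayerController : public APlayerController
{
    GENERATED_BODY()

protected:
    virtual void SetupInputComponent() override
    {
        Super::SetupInputComponent();
        // Multitouch out of the box: FingerIndex tells the touches apart.
        InputComponent->BindTouch(IE_Pressed,  this, &AMyPlayerController::OnTouchBegin);
        InputComponent->BindTouch(IE_Repeat,   this, &AMyPlayerController::OnTouchMove);
        InputComponent->BindTouch(IE_Released, this, &AMyPlayerController::OnTouchEnd);
    }

    void OnTouchBegin(ETouchIndex::Type FingerIndex, FVector Location) { /* Location.X/Y = screen position */ }
    void OnTouchMove (ETouchIndex::Type FingerIndex, FVector Location) {}
    void OnTouchEnd  (ETouchIndex::Type FingerIndex, FVector Location) {}
};
```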

The problem for us appears to stem from the visibility settings within each UMG widget’s brush, so we decided to scrap that UMG functionality and use UMG only to lay out our UI images, then feed those locations into the math you outlined. The problem comes from getting the locations of the images we are using.

This leads to the crux of the problem: there does not seem to be an accurate way of getting the UMG locations to match those of the touch input. There always seems to be some unaccountable offset, and while hard-coding the resolution would work, it would partially defeat the purpose of having UMG take care of scaling across devices and aspect ratios.

Both of your responses have given me some ideas and potential workarounds, and hopefully I’ll be able to respond with a solution that works for our current needs.

Are you placing things “dynamically” within the UMG?
If you are not, then the values you use are a fixed percentage of the resolution, which you can recalculate just by “knowing” that a given button is placed at a given position.

When building dynamically, you can still work out “where” an item is being placed, because the container you use to place it is still “hard-coded”, so to speak: placed at a certain point with a certain anchor.

I think you need to work the problem with the override, though.
What is likely happening is that the button/image is shifting focus and “stealing” input.
The override can be used to completely ignore the image’s default behaviour(s) and pass the input handling back to the controller (or something along those lines).

Override example for a gamepad:

Note the end portion :wink:

I’m trying to see if your recommendation solves the issue. Worst case, I may have to dive into C++ and wrangle it there, which some colleagues said may be the only solution.

Again, this is kind of outside my field. I’m originally an audio guy with a programming background who has transitioned into more general coding and design work, so I have never really messed with UIs aside from building the logic and doing some general gameplay math.

Thanks for the recommendations.


Let me know how it goes.

I need to pick my UI marketplace project back up this month.
I’ll try and see if what I have just works for Mobile… it worked just fine last time I tried a mobile release.

Though it’s not an inventory system that has to exist on screen while the controls are also present…

If you share a few screens of the setup (assuming you can) I would like to try to possibly emulate and work around the problem.
(Also good learning).

I figured it out.

Once I clean it up and make them into functions for our use case, I’ll go in and document it so I do not forget it and other people can check it out.

The biggest problem is that I had to wrap my head around how Unreal establishes its coordinates, and it turned out I was doing additional corrective math that was not necessary.

It also turned out that I needed to be pulling the Cached Geometry. Since it is an established UI element, I do not need frame-accurate positioning, but the Tick Space Geometry could also be used if you need per-frame updates.
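
I haven’t written up the final version yet, but the shape of it is roughly this sketch (Cached Geometry plus the built-in conversion; the wrapper name is a placeholder):

```cpp
// Guess at the shape of the fix described above: pull the widget's cached
// geometry and let Slate do the absolute -> viewport conversion itself.
// Only the engine calls are real API; the wrapper function is illustrative.
#include "Components/Widget.h"
#include "Blueprint/SlateBlueprintLibrary.h"

FVector2D GetWidgetViewportTopLeft(UWidget* Widget)
{
    // Cached geometry: updated as the widget lays out; fine for UI elements
    // that are already established (use GetTickSpaceGeometry for per-frame).
    const FGeometry& Geometry = Widget->GetCachedGeometry();

    // Widget's local (0,0) mapped into absolute desktop space...
    const FVector2D Absolute =
        USlateBlueprintLibrary::LocalToAbsolute(Geometry, FVector2D::ZeroVector);

    // ...then into the viewport space that touch input is reported in.
    FVector2D PixelPosition, ViewportPosition;
    USlateBlueprintLibrary::AbsoluteToViewport(Widget, Absolute,
                                               PixelPosition, ViewportPosition);
    return PixelPosition;
}
```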

Again, I’ll be documenting this, because it was a little bit more of a pain in the rear than it is in other engines.