UMG and touch - how?

My class is in the final week of making a mobile game at school. Assets, AI and such have been made and it’s time to make it a complete game where you can touch the screen and things will in fact happen.
It’s quite annoying that I can’t find any tutorials or info on simple things like how to make a main menu in UMG with a button that starts the game, using touch.

We also have an in-game HUD which needs touch on two buttons. These buttons are only enabled/visible when you have either collected an item or you’re close enough to a specific asset.

I feel I’ve wasted two days looking for info, but nothing relevant has shown up. I’ve found tons of other people searching for the same information though.

You can find hundreds of UMG tutorials on YouTube!

UMG is probably the easiest thing to do in a game; it just takes time to make it look good!

Good luck!

UMG itself isn’t the issue. Making it work with touch on an Android tablet seems to be less straightforward than we’d hoped.

We’re trying to make it possible to have a button overlaid on top of the game while playing.

Well yeah, just add the widget on Begin Play. In that widget, add a button that does whatever; it’s touchscreen-enabled by default, unless you have to activate something in the options. I’m working on an Android game myself. For example, I made a pause button: while I’m playing the game I can touch that button and it pauses the game. I don’t remember changing anything else. Straightforward, unless I’m not understanding your question correctly? ^^

You create a widget blueprint. In it you create your buttons and hook them up to whatever you want them to do. Now in your character, level, or whatever BP you want to use, you use the “Create Widget” node. In it you select the widget you created. Then you drag a pin from the “Create Widget” return pin and add an “Add to Viewport” node.
Run this on Begin Play or wherever. Bam, you’re done.

I’m beginning to think there’s something wrong with 4.10.2, because none of this works.

There is indeed no easy way to make a simple button with the touch interface.
Like one that registers a press, or when you drag a touch outside the button area, or when you drag a finger in, etc.
It gets worse with multiple fingers: if you drag one finger out and another one in, nothing in UMG can register this.

For buttons we ended up polling all touch locations, then calculating whether one is inside a button’s area.

The only thing we are lacking is an easy way to store information about which area of the screen has which function. Ideally that would mean storing the touch map as a texture (so it’s easy to see the zones in the editor, or display an overlay to test touch events), then pulling an ID from the pixel color at the touch coordinates. But that can only be done in C++. So instead we just calculated the location (and touch zone ID) with vector math.

UMG as a touch interface is useless. It can only be used to display the HUD, and even then it sometimes blocks touch if you forget to change the default properties of the widgets.

Hi guys. I had a final go at this today. I’m so fed up. Mostly because it’s such an easy fix…
All year we’ve been taught to use a “horizontal box” in UMG to easily space buttons evenly and such. Guess what: moving the buttons out of the horizontal box solved it. It’s as if the horizontal box blocks the buttons when it comes to touch. What do I know.

Thanks a billion for your effort and replies. Proves this community is all about helping. Appreciate it!

There is a setting in widgets that tells them not to consume touch, or something like that.

But how will it work for touch?
I add a button in the widget blueprint, but how do I add its touch functionality? Do I just have to set OnPressed? Will that also work for touch?

Thank you


Please also tell me how your problem was solved… I am suffering from the same problem. I am not able to make touch buttons. Please help.

Thank you

If you’re running into the issue of confirming that what you’ve written works within the UE4 Editor, make sure that you have “Use Mouse for Touch” enabled in your Project Settings. Not sure if this is your issue, but it’s an easy setting to miss.
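If I remember right, the same setting can also be flipped directly in the project’s `Config/DefaultInput.ini` (assuming a standard UE4 project layout), which is handy when you want it in source control:

```ini
[/Script/Engine.InputSettings]
bUseMouseForTouch=True
```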

I’d like to know that too.

You just need to set the Visibility of the Horizontal Box to “Not Hit-Testable (Self Only)”. This means you can’t touch the HB itself; instead you will touch what is inside it (your buttons). You need to set the same visibility on all of your containers. It should be set this way by default.