How to ALWAYS get the touch position on the screen

There is a node called “Get Input Touch State”. It gives you the touch position as a 2D vector.
Maybe this helps you.
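
For reference, that Blueprint node corresponds to APlayerController::GetInputTouchState in C++. Below is a minimal sketch, assuming a custom player controller subclass (the AMyPlayerController name is a placeholder), that polls the first finger every frame; the boolean output tells you whether that finger is currently down, so you only read the position while it is pressed.

```cpp
// MyPlayerController.h -- hypothetical player controller subclass
#pragma once
#include "CoreMinimal.h"
#include "GameFramework/PlayerController.h"
#include "MyPlayerController.generated.h"

UCLASS()
class AMyPlayerController : public APlayerController
{
    GENERATED_BODY()
public:
    virtual void PlayerTick(float DeltaTime) override;
};

// MyPlayerController.cpp
#include "MyPlayerController.h"

void AMyPlayerController::PlayerTick(float DeltaTime)
{
    Super::PlayerTick(DeltaTime);

    float TouchX = 0.f;
    float TouchY = 0.f;
    bool bIsPressed = false;

    // Same data the "Get Input Touch State" Blueprint node returns.
    GetInputTouchState(ETouchIndex::Touch1, TouchX, TouchY, bIsPressed);

    if (bIsPressed)
    {
        UE_LOG(LogTemp, Log, TEXT("Touch1 at %f, %f"), TouchX, TouchY);
    }
}
```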


I have a mobile game I'm working on. I've noticed the player controller handles touch input, but if I tap a button in a widget, the controller's touch inputs no longer fire.

I have also noticed that touch events in widgets (Touch Started, Touch Moved, and Touch Ended) do not fire unless the finger first touched the screen somewhere other than the widget, which makes them sort of useless…

I need to get the finger's viewport position 100% of the time, no matter what. It can come from the widget or the player controller, it doesn't matter. But the problem is: if a widget is touched, the controller can't click in the world, and if a button is touched, the controller can't handle input at all.

I simply want a Blueprint that ALWAYS gets the touch location in the viewport and can then send it to the GUI and/or the player controller. How is this possible?
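
For context, the controller-side input being described above is the engine's standard touch binding. A minimal C++ sketch of that binding follows, assuming the same hypothetical AMyPlayerController subclass and placeholder handler names; the Blueprint equivalent is the InputTouch event node.

```cpp
// Sketch of the standard touch bindings on the hypothetical AMyPlayerController.
// OnTouchPressed / OnTouchMoved must be declared on the class with the
// signature void(ETouchIndex::Type, FVector).
void AMyPlayerController::SetupInputComponent()
{
    Super::SetupInputComponent();

    InputComponent->BindTouch(IE_Pressed, this, &AMyPlayerController::OnTouchPressed);
    InputComponent->BindTouch(IE_Repeat,  this, &AMyPlayerController::OnTouchMoved);
}

void AMyPlayerController::OnTouchPressed(ETouchIndex::Type FingerIndex, FVector Location)
{
    // Location.X / Location.Y are the touch's viewport coordinates.
    UE_LOG(LogTemp, Log, TEXT("Touch pressed at %f, %f"), Location.X, Location.Y);
}

void AMyPlayerController::OnTouchMoved(ETouchIndex::Type FingerIndex, FVector Location)
{
    UE_LOG(LogTemp, Log, TEXT("Touch moved to %f, %f"), Location.X, Location.Y);
}
```

These callbacks only fire when the touch is not consumed by a Slate/UMG widget first, which is exactly the limitation described in the post above.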

This is extremely helpful, thank you. However, the problem persists (although I've now narrowed it down to this): if the finger or mouse is over a button, all bets are off; the node immediately stops working in every Blueprint, including the widget that contains the button.

I set it up like this and it works until I actively start pressing a button in my widget. I don't know if there is a way to prevent this from happening.
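
One possible workaround, sketched below on the assumption that a C++ solution is acceptable: register a Slate input preprocessor, which receives pointer and touch events before they are routed to any widget, so a button cannot swallow them first. The FTouchTracker class name is hypothetical and the behaviour should be verified on your engine version; note also that the recorded position is in Slate screen space rather than viewport space, so it may still need converting (for example using the viewport widget's geometry), which is left out of the sketch.

```cpp
// TouchTracker.h -- hypothetical Slate input preprocessor that records the
// latest pointer/touch position even when a UMG button consumes the event.
#pragma once
#include "CoreMinimal.h"
#include "Framework/Application/IInputProcessor.h"
#include "Framework/Application/SlateApplication.h"

class FTouchTracker : public IInputProcessor
{
public:
    virtual void Tick(const float DeltaTime, FSlateApplication& SlateApp,
                      TSharedRef<ICursor> Cursor) override {}

    // Touch presses arrive here as pointer events before Slate routes them
    // to widgets, so a button cannot consume them first.
    virtual bool HandleMouseButtonDownEvent(FSlateApplication& SlateApp,
                                            const FPointerEvent& Event) override
    {
        LastScreenPosition = Event.GetScreenSpacePosition();
        bIsPressed = true;
        return false; // Do not consume; let UMG/buttons behave normally.
    }

    virtual bool HandleMouseMoveEvent(FSlateApplication& SlateApp,
                                      const FPointerEvent& Event) override
    {
        LastScreenPosition = Event.GetScreenSpacePosition();
        return false;
    }

    virtual bool HandleMouseButtonUpEvent(FSlateApplication& SlateApp,
                                          const FPointerEvent& Event) override
    {
        bIsPressed = false;
        return false;
    }

    FVector2D LastScreenPosition = FVector2D::ZeroVector;
    bool bIsPressed = false;
};

// Registration, e.g. during game startup (and unregister on shutdown):
//   TSharedPtr<FTouchTracker> TouchTracker = MakeShared<FTouchTracker>();
//   FSlateApplication::Get().RegisterInputPreProcessor(TouchTracker);
//   ...
//   FSlateApplication::Get().UnregisterInputPreProcessor(TouchTracker);
```

The tracked position can then be exposed to Blueprint (for example through a Blueprint function library or the player controller) so both the GUI and the controller can read it regardless of what the finger is over.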