I’m working on a spell system like the ones you find in mobile MOBAs, where you touch and hold your finger on a spell button, drag it to aim the direction, and cast the skill/spell on release.
How should I do that?
I’m currently at the point where I can aim with the right thumbstick of the virtual joystick and the character follows its direction, but I cannot seem to find a way to trigger an action when the button is released.
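In case it helps: in C++ you can bind the release event directly with `UInputComponent::BindTouch` using `IE_Released` (in Blueprint, the `InputTouch` event has a Released pin). The core of the aim-on-drag, cast-on-release pattern is a small per-finger state machine; here is a minimal engine-free sketch of that logic (names and the coordinate handling are my own, not engine API):

```cpp
#include <map>

// Per-finger aiming state: remember where the drag started so we can
// compute an aim direction and cast the spell on release.
struct TouchTracker {
    struct Drag { float x0, y0, x, y; };
    std::map<int, Drag> active;  // finger index -> drag state

    // Call from the touch-pressed handler (e.g. IE_Pressed).
    void onPressed(int finger, float x, float y) {
        active[finger] = {x, y, x, y};
    }
    // Call from the touch-moved handler (e.g. IE_Repeat).
    void onMoved(int finger, float x, float y) {
        auto it = active.find(finger);
        if (it != active.end()) { it->second.x = x; it->second.y = y; }
    }
    // Call from the touch-released handler (e.g. IE_Released).
    // Returns true and the aim vector when a tracked finger lifts;
    // this is the point where you cast the spell.
    bool onReleased(int finger, float& dirX, float& dirY) {
        auto it = active.find(finger);
        if (it == active.end()) return false;  // finger never owned a spell
        dirX = it->second.x - it->second.x0;
        dirY = it->second.y - it->second.y0;
        active.erase(it);
        return true;
    }
};
```

Keying everything by finger index is what stops a second, unrelated finger from triggering the cast: only the finger that pressed the button is tracked, and only its release fires.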
The last time I did anything with touch support and mobile games was about a year ago. Maybe Epic has improved it since then, but I doubt it.
Back then, UMG was not made for touch interfaces.
For example, I could not implement a properly working fire button: no matter which finger touched it, it would fire. The touch interface tracks each touch spot and numbers them, but there were ways to touch a button without it registering as a new event, and when you slide a finger outside a UMG widget, it does not register as the end of the touch.
Simply put, tracking touch events was full of exceptions to the rules, and reliable code has to watch for all of them.
I ended up writing my own code for the virtual pad and fire buttons, and unless they have added better touch support, I suggest you do the same:
- First, learn how screen scaling and UMG work on mobile devices, and make your code adapt to it. You can simulate different resolutions for a virtual device while running in the editor; make your code work properly at all of them.
- Then scale touch coordinates to a single chosen resolution. I used 1920x1080: no matter the device resolution, I always scaled touch input to 1080p.
- Last, you need some functions that run before the UMG touch interface. Calculate whether the touch is in the area of the fire buttons or the virtual pad, perform the actions for that input, and finally animate the underlying UMG widget.
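Putting those steps together, the scale-then-hit-test routing looks roughly like this. This is an engine-free sketch of the approach, not engine code; the button and pad rectangles are made-up example values, not a real layout:

```cpp
// Normalize a raw touch coordinate into a fixed 1920x1080 reference
// space, so button hit-tests behave the same on every device resolution.
struct Vec2 { float x, y; };

Vec2 toReference(Vec2 raw, Vec2 deviceRes) {
    return { raw.x * 1920.f / deviceRes.x,
             raw.y * 1080.f / deviceRes.y };
}

// Axis-aligned hit-test in reference space.
struct Rect { float x, y, w, h; };

bool contains(const Rect& r, Vec2 p) {
    return p.x >= r.x && p.x <= r.x + r.w &&
           p.y >= r.y && p.y <= r.y + r.h;
}

// Example layout values only.
const Rect kFireButton = {1600.f, 800.f, 200.f, 200.f};  // bottom-right
const Rect kVirtualPad = {100.f, 700.f, 300.f, 300.f};   // bottom-left

// Route a touch-press before any UMG handling: decide which control
// (if any) this finger now owns, then animate the UMG widget separately.
enum class Owner { None, FireButton, VirtualPad };

Owner routePress(Vec2 raw, Vec2 deviceRes) {
    Vec2 p = toReference(raw, deviceRes);
    if (contains(kFireButton, p)) return Owner::FireButton;
    if (contains(kVirtualPad, p)) return Owner::VirtualPad;
    return Owner::None;
}
```

Because the routing happens in your own code, the UMG widgets become purely visual: they animate in response to your input layer instead of handling touches themselves, which sidesteps the slide-off-the-widget and wrong-finger problems.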
That is how I did my touch interface, after a month or two of fighting with all the quirks of UMG.