I’m sorry if this is a silly question, but I don’t have a clue how to do this.
Basically, I want to create a skill input where you hold a button, drag and release it in a direction on the HUD, and the character fires the skill in that direction in the world.
And I would also like to know how to make a skill that doesn’t aim in a direction but at a point on the ground, like Ziggs’ ‘W’ or ‘E’ (from League of Legends). I didn’t find a reference in a mobile game, but I hope the explanation is enough. This one actually sounds harder to do than the first.
For a direction based on a UI object, you could override the OnTouchStart and OnTouchEnd functions in its widget, store the location of each as a variable, then find a relative look-at rotation from them and pass it along to the actor, adding or subtracting as needed to line up with world space.
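If it helps to see that outside Blueprint, here’s a minimal C++ sketch of the widget-override idea. The `FireSkill` call and the 45-degree camera offset are assumptions for illustration, so adjust both to your setup:

```cpp
// SkillAimWidget.h — a minimal sketch of the OnTouchStart/OnTouchEnd idea.
#pragma once

#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "SkillAimWidget.generated.h"

UCLASS()
class USkillAimWidget : public UUserWidget
{
    GENERATED_BODY()

protected:
    FVector2D TouchStart;

    virtual FReply NativeOnTouchStarted(const FGeometry& InGeometry, const FPointerEvent& InGestureEvent) override
    {
        // Remember where the drag began, in screen space.
        TouchStart = InGestureEvent.GetScreenSpacePosition();
        return FReply::Handled();
    }

    virtual FReply NativeOnTouchEnded(const FGeometry& InGeometry, const FPointerEvent& InGestureEvent) override
    {
        const FVector2D TouchEnd = InGestureEvent.GetScreenSpacePosition();
        const FVector2D Drag = TouchEnd - TouchStart;

        if (!Drag.IsNearlyZero())
        {
            // Screen-space angle of the drag; screen Y grows downward, so
            // negate it to get a conventional counter-clockwise angle.
            const float ScreenAngle = FMath::RadiansToDegrees(FMath::Atan2(-Drag.Y, Drag.X));

            // Add or subtract here to line up with world space
            // (assumed 45-degree camera yaw).
            const float WorldYaw = ScreenAngle + 45.f;
            UE_LOG(LogTemp, Log, TEXT("Aim yaw: %f"), WorldYaw);

            // FireSkill is a hypothetical function on your character:
            // if (AMyCharacter* Char = Cast<AMyCharacter>(GetOwningPlayerPawn()))
            // {
            //     Char->FireSkill(WorldYaw);
            // }
        }
        return FReply::Handled();
    }
};
```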
Finding the direction when touching something in the world should be easier. If you have an input event set up for touch, you can use a GetHitResultUnderFinger node to get the world location, then find the look-at rotation from the character to it.
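In C++ terms that’s roughly the sketch below; `AMyCharacter` and the InputTouch binding are assumptions, while `GetHitResultUnderFinger` and `FindLookAtRotation` are the actual engine APIs those Blueprint nodes wrap:

```cpp
#include "Kismet/KismetMathLibrary.h"
#include "GameFramework/PlayerController.h"

// Bound to the InputTouch "Pressed" event in SetupPlayerInputComponent.
void AMyCharacter::OnTouchPressed(ETouchIndex::Type FingerIndex, FVector Location)
{
    APlayerController* PC = Cast<APlayerController>(GetController());
    if (!PC)
    {
        return;
    }

    FHitResult Hit;
    // Trace from the touch position into the world.
    if (PC->GetHitResultUnderFinger(FingerIndex, ECC_Visibility, false, Hit))
    {
        // Rotation from the character toward the touched point.
        const FRotator Aim = UKismetMathLibrary::FindLookAtRotation(GetActorLocation(), Hit.Location);
        SetActorRotation(FRotator(0.f, Aim.Yaw, 0.f)); // keep only yaw for ground aiming
    }
}
```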
Ohh, thank you. That was giving me a headache. The thing I struggle with the most is finding the right nodes. Today I found the ‘atan2 (Degrees)’ node and was about to do everything by hand.
Do you have any idea how to make the second skill input? I imagine I would have to get the ‘TouchInputStart’ and ‘TouchInputEnd’ in a normalized area around the button and translate that into 3D inside a normalized area that is the range of the skill. But I’m not sure how to do the ‘normalized area’ part, or if that’s even a workable solution.
For a direction based on the ability button, in theory you could use the Get Cached Geometry node on the button widget, get the pixel position from that, and use it as the start location for Find Look at Rotation, with the touch location as the end. This would likely get a bit wonky, since you’d need to do a bunch of math on the x and y to account for the offset of images and such. It might be easier to use the mobile virtual joysticks to control an aimed ability.
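As a rough sketch of the cached-geometry part: you can fetch the button’s center in viewport space so it can serve as the “start” of the aim direction, with the touch location as the “end”. `LocalToViewport` here is the C++ side of the Blueprint “Local to Viewport” node; the helper name is just for illustration:

```cpp
#include "Components/Widget.h"
#include "Blueprint/SlateBlueprintLibrary.h"

FVector2D GetButtonCenterInViewport(UWidget* Button)
{
    const FGeometry& Geo = Button->GetCachedGeometry();
    FVector2D PixelPos, ViewportPos;
    // Convert the button's local-space center into viewport coordinates.
    USlateBlueprintLibrary::LocalToViewport(Button, Geo, FVector2D(Geo.GetLocalSize()) * 0.5f, PixelPos, ViewportPos);
    return ViewportPos;
}
```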
Here’s a super dirty attempt I made at a touch aim thing. Sorry it’s a bit of a noodley mess, but it mostly works.
In my case, the widget component variable at the top left is a 200x200 circle image, and my player character’s camera is rotated by 45 degrees, so adjust the addition nodes as needed. The “shootty” function being called just spawns a projectile actor, using the input to set the spawn rotation.
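In C++ terms, a spawn function like that might look something like this; `ProjectileClass` is an assumed `TSubclassOf<AActor>` property and the 100-unit forward offset is arbitrary:

```cpp
// Spawn a projectile using the yaw computed from the touch input.
void AMyCharacter::Shootty(float AimYaw)
{
    const FRotator SpawnRot(0.f, AimYaw, 0.f);
    // Offset the spawn point so the projectile doesn't overlap the character.
    const FVector SpawnLoc = GetActorLocation() + SpawnRot.Vector() * 100.f;

    FActorSpawnParameters Params;
    Params.SpawnCollisionHandlingOverride = ESpawnActorCollisionHandlingMethod::AlwaysSpawn;
    GetWorld()->SpawnActor<AActor>(ProjectileClass, SpawnLoc, SpawnRot, Params);
}
```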
I don’t know if I got it right, but that’s a solution to the first problem, right? It’s now working in my project thanks to what you said.
When I press the button, it spawns an actor that acts as the aim indicator, and this event keeps updating that actor’s rotation each frame so you can aim properly (I did it for the mouse but will do it for mobile later):
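Roughly, that per-frame update in C++ would look like the sketch below, using the mouse for now (`GetHitResultUnderCursor`); the mobile version would swap in `GetHitResultUnderFinger`. `AimIndicator` is an assumed `AActor*` pointing at the spawned aim actor:

```cpp
#include "Kismet/KismetMathLibrary.h"

void AMyCharacter::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    APlayerController* PC = Cast<APlayerController>(GetController());
    if (!PC || !AimIndicator)
    {
        return;
    }

    FHitResult Hit;
    if (PC->GetHitResultUnderCursor(ECC_Visibility, false, Hit))
    {
        // Point the indicator from the character toward the cursor hit.
        const FRotator Aim = UKismetMathLibrary::FindLookAtRotation(GetActorLocation(), Hit.Location);
        AimIndicator->SetActorRotation(FRotator(0.f, Aim.Yaw, 0.f));
    }
}
```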
Now I’ve got the next problem, but I think I might have the solution: I’ll get the direction between the ‘starttouch’ and ‘endtouch’ using the ‘Find Look at Rotation’ node, get the distance with the ‘Distance 2D’ node, and use both of those in the world.
Thanks again. It’s very hard to find which nodes to use in each situation. I searched a lot and still don’t know exactly what ‘Get Cached Geometry’ does, but I’ll learn more about it in case I need it later.
For the relative distance: after you get the distance between the start and end touch, you could divide that by the radius of the aim circle to roughly normalize it, then multiply the result by the max range of the ability.
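That step fits in a tiny helper like the one below; the 100px radius matches half of a 200x200 circle image, and both parameters are assumptions:

```cpp
// Pixel drag distance divided by the aim circle's radius, clamped to
// [0,1], then scaled to the ability's max range in world units.
float DragToWorldRange(const FVector2D& TouchStart, const FVector2D& TouchEnd,
                       float AimCircleRadiusPx, float MaxAbilityRange)
{
    const float PixelDist = FVector2D::Distance(TouchStart, TouchEnd);
    const float Normalized = FMath::Clamp(PixelDist / AimCircleRadiusPx, 0.f, 1.f);
    return Normalized * MaxAbilityRange;
}

// e.g. DragToWorldRange(Start, End, 100.f, 1200.f) -> target distance in world units
```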
The cached geometry node does seem a bit obscure. I’m not that familiar with it either, but as used above, it’s just a way to fetch info about a widget panel from the layout system.
I added the things you said and it works perfectly, thanks. I wouldn’t have been able to pull this off alone. It runs every tick and does some math, which may be heavy, but it’s the solution for now.