No Blueprint Touch Gesture support?

I’ve been hunting around all day trying to find how to get gestures working in my iOS project (based on Monokkel’s lovely TBS toolkit). I’ve taught myself the iOS SDK and Cocos2D (and Kobold2D). They all have fairly intuitive systems for gesture detection (single tap vs. double tap, pinch-zoom vs. two-finger scroll). I don’t seem to be able to find anything remotely close to that, even at the C++ level. Am I missing something, or is each iOS developer really expected to re-invent the wheel, so to speak? I was actually very surprised that there wasn’t a whole suite of Blueprint nodes to do this job already. Most engines that advertise mobile support include the usual bunch of gestures right out of the gate. Everything I’ve found so far indicates that I have to build all of that myself. This seems like a significant oversight. Am I wrong? Is this at least something on the roadmap? If not, shouldn’t it be?

J^2

Hey ,

When you’re in the Editor, go to Edit > Project Settings > Engine > Input. Once you’re under the Engine - Input tab, you can use the Mobile section for specific gestures in your project. You can also add ‘console keys’. Once you’re in ‘console keys’, expand the ‘Keyboard’ section, scroll to the bottom, and you’ll see options for gestures. I hope this helps.
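For what it’s worth, those same bindings can also be set up by hand in your project’s Config/DefaultInput.ini. Here’s a rough sketch of what that might look like — the action names are made up, and the gesture key names (Gesture_Pinch, Gesture_Flick, Gesture_SwipeLeftRight) are what I believe the engine defines in InputCoreTypes, but verify them against your engine version before relying on this:

```ini
; Hypothetical example — action names are placeholders, and the gesture
; key names should be checked against your engine version's InputCoreTypes.
[/Script/Engine.InputSettings]
+ActionMappings=(ActionName="ZoomMap",Key=Gesture_Pinch)
+ActionMappings=(ActionName="QuickScroll",Key=Gesture_Flick)
+ActionMappings=(ActionName="PanCamera",Key=Gesture_SwipeLeftRight)
```

Once mapped, each action shows up as a matching InputAction event node in your Blueprints.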

Here is a guide: Input in Unreal Engine | Unreal Engine 5.1 Documentation

Thanks!

Thank you for that pointer. I did find touch-based gestures in those menus, but I’m still at a loss as to how they are used. I can’t seem to find any tutorials, or even documentation, on how to use them. I would have expected a whole section of the Mobile Development area of the documentation to discuss working with touch-based input, since it is unique to that platform. Yet there is nothing there.

The only thing I can find in the docs about a touch interface at all is how to set up the virtual joysticks. Which is great if that’s the kind of game you are building. Yet people still need to make menus, buttons, pinch-to-zoom, and all the other basic interactions that have become so commonplace on touch-based devices that their absence is rather conspicuous. I can find nothing in the docs about such things. In fact, a search for touch input on the docs shows five or more questions on the Answer Hub about how to use it for every one documentation entry. This implies to me a very strong need for better documentation and some tutorials on how to build a full touch-based interface that doesn’t rely on the virtual joysticks, especially since most mobile games don’t use them.

Essentially, what I’m asking for here is not the location of the variables but how to use them in a practical application. With UE4 going free, you are going to be seeing a lot more first-time developers wishing to make iOS games asking these same questions. I think it would be time well spent to have someone in the know work up a batch of tutorials on how to get a non-joystick touch interface working for mobile users. Heck, with Win 8, it would apply even to PC users…

Is this something I should be posting in the Feedback section perhaps?

J^2

I’m personally using the InputTouch node in combination with ‘Get Input Touch State’ and its X and Y coordinates. All in all, for a simple swipe-to-move-camera setup and a simple follow-pawn-with-camera option, there are about 50 nodes or so. I’m not sure if that sounds like a lot, but it’s quick to make, and it also has some fancy ease in/out motions, checks, and so on…

There might be something on the Marketplace if you’re interested in a more pre-built solution. I don’t know of any tutorial specific to this, but there’s plenty on menus (aka UMG).

I’d suggest asking here for specifics as well!

Good luck.

Wow! I had no idea that there was so much stuff hidden in the project settings!

I do see some swipes in the Keyboard section, but it seems there is only a “Swipe from Left to Right” and no “Swipe from Right to Left”.

I’ve written some gesture recognizers in C++ as actor components. Though written in C++, they’re designed to be used from blueprint. Add one (or more) to a blueprint, and you can then receive events when a gesture meeting your specified requirements occurs.

It’s still a WIP, but most of the gestures are solid. The pinch/rotate still has some issues that need to be resolved, but the rest are functional. The project has a problem compiling on iOS right now, but you should be able to copy the C++ classes to another project no problem. They’re compiling fine in . I think the problem comes from the fact that the project’s include “GestureRecognizer.h” clashes with an engine class file of the same name.