Mobile Gesture Recognizers [Updated + Tutorial]

In the process of making our game, we’ve been working on a set of gesture recognizers. Today I updated the GitHub repository with the most current version. This version makes the recognizers more sophisticated and able to recognize gestures while other gestures are in progress. For example, if you are using one finger to pan, you can still tap or swipe with a second finger and have it reliably detected.

The following gesture recognizers are available and ready for use:

UTapGestureRecognizer - a “smart” tap recognizer that can recognize multi-taps (double, triple, etc.) without triggering false positives (e.g. no single-tap fires when a double-tap happens), and also supports multi-finger taps (e.g. a two-finger double tap).
UInstantTapRecognizer - a not-so-smart but fast recognizer that tells you when any finger touches or leaves the screen. Useful when the delay UTapGestureRecognizer needs to recognize multi-taps would be a problem for gameplay.
USwipeGestureRecognizer - a smart swipe recognizer that can recognize horizontal, vertical, diagonal, and edge swipes. (A short C++ sketch showing how to attach these components follows this list.)
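Since the recognizers are implemented as actor components, a C++ project can attach them in an actor’s constructor like any other component. Here is a minimal sketch under that assumption; the actor name and the header file names are illustrative, so adjust the includes to wherever you copied the source:

// Minimal sketch (not from the repo): attaching two recognizers to an actor in C++.
// The include names below are assumptions; match them to the files you copied in.
#include "MyGestureActor.h"            // hypothetical actor used for this example
#include "TapGestureRecognizer.h"      // assumed header for UTapGestureRecognizer
#include "SwipeGestureRecognizer.h"    // assumed header for USwipeGestureRecognizer

AMyGestureActor::AMyGestureActor()
{
    // Standard UE4 component creation; in practice you would store these pointers
    // in UPROPERTY members declared in the actor's header so Blueprint can see them.
    UTapGestureRecognizer* Tap = CreateDefaultSubobject<UTapGestureRecognizer>(TEXT("TapRecognizer"));
    USwipeGestureRecognizer* Swipe = CreateDefaultSubobject<USwipeGestureRecognizer>(TEXT("SwipeRecognizer"));
}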

The following gesture recognizers mostly work, but still need some love:

UPanGestureRecognizer - allows you to get notified when fingers move on the screen. It works well, but right now only a single pan recognizer can be used reliably.
UPinchRotateGestureRecognizer - a gesture recognizer that notifies you when a two-finger pinch or rotate gesture is happening.

There are also two base classes (UGestureRecognizerComponent and UDynamicGestureRecognizer) that handle much of the work involved in gesture recognition and can be subclassed to create new gestures.
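As a rough illustration of the subclassing idea, here is what a custom recognizer’s header might look like. Treat it purely as a sketch: apart from the base class name UGestureRecognizerComponent, everything here (the include path, the property, and the note about which virtuals to override) is an assumption, so check the base class header for the actual hooks.

// Hypothetical example: a long-press recognizer built on the shared base class.
// Only UGestureRecognizerComponent comes from the repo; all other names are illustrative.
#pragma once

#include "GestureRecognizerComponent.h"        // assumed header name for the base class
#include "LongPressGestureRecognizer.generated.h"

UCLASS(ClassGroup = Input, meta = (BlueprintSpawnableComponent))
class ULongPressGestureRecognizer : public UGestureRecognizerComponent
{
    GENERATED_BODY()

public:
    // Seconds a finger must stay down before the gesture is reported.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Gestures")
    float MinimumHoldTime = 0.5f;

    // Override the base class's touch-handling virtuals here (names vary by version),
    // record when the touch began, and fire the ended delegate once the hold time elapses.
};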

Future plans include moving the touch-processing code to a shared class to reduce redundant processing, getting the pinch/rotate gesture working better, and making sure that multiple pan gestures can be used at once.

These are all written in C++, but they are designed to be used from Blueprint. A lot of people working in Blueprint-only projects have asked me how to use them, because copying C++ classes between projects can be a little intimidating and projects started from a Blueprint template are not configured to use C++ classes at all. So, until Epic gives us a way to easily package C++, I thought I’d throw together a quick tutorial for people interested in using the gesture recognizers in a Blueprint project.

Setting Up the Project to Use the Gesture Recognizers

  1. Make sure your machine is set up to compile C++. On Windows, that means installing Visual Studio (although I understand it will be included with UE4 soon). On Mac OS, it means installing Xcode from the Mac App Store or Apple Developer Center. Refer to the UE4 documentation for the steps to set up your machine.
  2. Get a copy of the gesture recognizer source code. You can do that either by cloning the repo or by selecting “Download ZIP”. There are two folders in the repo. The folder labeled Gesture Recognizer Source contains the C++ files. The other folder, GestureSample, contains a sample project you can use as a reference; it’s the one I made while writing this tutorial, actually.
  3. If your project is already set up to compile C++, then you just need to copy the files from Gesture Recognizer Source into your project’s Source folder in the desired place in the folder hierarchy.
  4. If your project isn’t set up for C++, then you need to open your project in the Unreal Editor and select Add Code to Project from the File menu of the main window. This next step is a little hokey, and if anyone knows a better way, please let me know: we need to add a class we won’t actually use in order to get the Editor to configure the project for compiling code. Select a parent class of None, type a file name (or just leave the default), and hit Create Class. Once it finishes, copy the source code files from Gesture Recognizer Source into the Source folder, in the subfolder named after your project (e.g. [Project Folder]/Source/[Project Name]/). You can also delete the class you created earlier.
  5. Each of the source files you just copied has a line that looks like this: #include "%%%PROJECTHEADER%%%". Every project that supports C++ has a project header file that must be included. Use Find & Replace to replace %%%PROJECTHEADER%%% with the name of your project header, which should be [projectname].h. In my case, #include "%%%PROJECTHEADER%%%" became #include "GestureSample.h" in all the copied .cpp files (see the snippet after these steps).
  6. In the File Explorer (Windows) or Finder (Mac), find your .uproject file and right-click on it. The contextual menu should have an option to “Generate Xcode Project” or “Generate Visual Studio Project”. This will update the project to include the files you just copied into your project. You won’t actually have to open the Xcode or Visual Studio project unless you want to modify code, but this updates all the files used to build the code parts of the project.
  7. Open your project in the Unreal Editor again, and you should be ready to go. If it asks you to compile the project, answer yes. If it doesn’t ask, you can hit the new Compile button to make sure; Compile builds the C++ code much the way the Build button builds Blueprints.
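To make step 5 concrete, here is roughly what the top of one of the copied .cpp files would look like after the find-and-replace, assuming a project named GestureSample (the second include is illustrative; the actual header name depends on the file):

// Top of a copied .cpp after replacing the placeholder, for a project named "GestureSample".
#include "GestureSample.h"           // was: #include "%%%PROJECTHEADER%%%"
#include "TapGestureRecognizer.h"    // the recognizer's own header (name may differ in your copy)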

Using the Gesture Recognizers

Using them is straightforward. They are implemented as actor components, so you can simply add the appropriate component to the actor blueprint where you want to receive the input. To receive a tap gesture, for example, simply add the Tap Gesture Recognizer to the actor:

http://3.t.imgbox.com/OXjtrHUS.jpg

Now configure the component to your liking. If you want only double-taps with one finger, set both the minimum and maximum tap count to 2 and the minimum and maximum touch count to 1, like so:

http://0.t.imgbox.com/JHs56Q9y.jpg

Next, for whichever delegate method you want to implement, click the corresponding green button. The tap gesture only has a “Gesture Ended” delegate, so press that one to add the event to your event graph.

http://9.t.imgbox.com/imj70QJW.jpg

In the event, you should cast the gesture recognizer to the actual type, since the delegates are generic. Then, simply query the recognizer for the information you need. Here’s how we would get the screen location of the tap and print it to the screen:

http://0.t.imgbox.com/kJV9w8vq.jpg

Now, run the project. If you double-tap, the coordinates of the tap will be printed to the screen. If you single-tap, or triple-tap, or do a two-finger double tap, it won’t, because those tap gestures are not within the parameters you set up. Each recognizer type has a set of functions and/or properties in the category “Gestures|Results” that will give you the information you need. All gesture delegates also give you DeltaTime as a convenience.
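For anyone wiring this up from C++ instead of Blueprint, the flow is the same: bind the recognizer’s ended delegate, cast inside the handler, and query the result accessors. The sketch below continues the hypothetical AMyGestureActor from earlier and assumes names for the delegate (OnGestureEnded), the accessor (GetGestureLocation), and the tap/touch count properties; check the component headers for the real identifiers before relying on it.

// Sketch only: a C++ equivalent of the Blueprint setup above (same assumed includes as before).
// OnGestureEnded, GetGestureLocation, and the count property names are assumptions.
void AMyGestureActor::BeginPlay()
{
    Super::BeginPlay();

    if (UTapGestureRecognizer* Tap = FindComponentByClass<UTapGestureRecognizer>())
    {
        // Mirror the Blueprint details panel: one-finger double tap only.
        Tap->MinimumTapCount = 2;
        Tap->MaximumTapCount = 2;
        Tap->MinimumTouchCount = 1;
        Tap->MaximumTouchCount = 1;

        // "Gesture Ended" is the only delegate the tap recognizer exposes; assuming it is a
        // dynamic multicast delegate, the handler must be a UFUNCTION to bind like this.
        Tap->OnGestureEnded.AddDynamic(this, &AMyGestureActor::HandleTapEnded);
    }
}

void AMyGestureActor::HandleTapEnded(UGestureRecognizerComponent* Recognizer, float DeltaTime)
{
    // The delegate passes the generic base type, so cast to the concrete recognizer first.
    if (UTapGestureRecognizer* Tap = Cast<UTapGestureRecognizer>(Recognizer))
    {
        // Query a "Gestures|Results" accessor for the tap's screen location and print it.
        const FVector2D Location = Tap->GetGestureLocation();   // assumed accessor name
        GEngine->AddOnScreenDebugMessage(-1, 2.0f, FColor::Green, Location.ToString());
    }
}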

If you have any questions, please feel free to ask. There are no restrictions on the use of these recognizers, but I welcome additions, modifications, or bug fixes.

Thanks!

Just a note that I pushed some changes this morning so that all tolerances and limits are now based on device-independent points. If you say a swipe must be 50 units long, that value is no longer treated as raw device pixels (which would result in a huge difference between, say, an iPhone 6+ and an iPad 2); instead, it is scaled based on the screen density. Most recognizers now support giving you data in either pixels or points.
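In other words, a configured threshold is now interpreted in points and multiplied by the device’s scale factor internally, roughly like this (the helper and the example scale value are illustrative, not the recognizers’ actual code):

// Illustrative only; the recognizers' real scaling code may differ.
// Converts a tolerance expressed in device-independent points into device pixels.
static float PointsToPixels(float Points, float PixelsPerPoint)
{
    return Points * PixelsPerPoint;
}

// Example: a 50-point swipe threshold on a 2x-density screen becomes 100 pixels,
// so the same physical finger travel is required on low- and high-density devices.
// const float MinSwipePixels = PointsToPixels(50.0f, 2.0f);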

Hi,

Thanks for sharing this!

Maybe this implementation will help you. Good luck!
https://github.com/TouchScript/TouchScript/tree/master/TouchScript

Just wanted to post an update. I’ve had several Blueprint users ask how they can use this. I finally found time to package the mobile gesture recognizers up into a plugin that BP-only users can install and use. You can find more information, along with video showing how to install and use the plugin, in this thread.

Thank you for sharing this with us. That’s pretty neat.

I’ve noticed that the latest version has binaries for Android, but these don’t seem to work properly. When I try to build or package for Android, I get the following error:

RunUAT.bat ERROR: AutomationTool was unable to run successfully.

I’m not sure if I’m doing something wrong or if the Android binary just hasn’t been tested properly yet; any help on the matter would be immensely appreciated.

Hi, I have fixed the errors and it now compiles against UE 4.10+, but it looks like the gestures are not being recognized…
Repo Link