I’ve been hunting around all day trying to find out how to get gestures working in my iOS project (based on Monokkel’s lovely TBS toolkit). I’ve taught myself the iOS SDK, Cocos2D, and Kobold2D, and they all have fairly intuitive systems for gesture detection (single tap vs. double tap, pinch zoom vs. two-finger scroll). I can’t seem to find anything remotely close to that here, even at the C++ level. Am I missing something, or is each iOS developer really expected to reinvent the wheel, so to speak?

I was actually very surprised that there isn’t already a whole suite of Blueprint nodes to do this job. Most engines that advertise mobile support include the usual bunch of gestures right out of the gate, but everything I’ve found so far indicates that I have to build it all myself. This seems like a significant oversight. Am I wrong? Is this at least on the roadmap? If not, shouldn’t it be?
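For context, the kind of logic I mean is pretty simple to sketch in plain C++, which makes it all the more surprising it isn't built in. This is just an illustration of the classification idea, not any engine's API; the helper names (`ClassifyTap`, `ClassifyTwoFinger`) and thresholds are made up:

```cpp
#include <cmath>
#include <string>

struct Touch { float x, y; };

// Two sequential taps count as a double tap if the second lands within
// maxDelaySec of the first; otherwise treat the first as a single tap.
std::string ClassifyTap(double firstTapTime, double secondTapTime,
                        double maxDelaySec = 0.3) {
    return (secondTapTime - firstTapTime) <= maxDelaySec ? "double_tap"
                                                         : "single_tap";
}

static float Dist(const Touch& a, const Touch& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Classify a two-finger move from start positions (a0, b0) to end
// positions (a1, b1): if the spacing between the fingers changed by more
// than pinchThreshold it is a pinch (out or in), otherwise the fingers
// moved together at roughly constant spacing, i.e. a two-finger scroll.
std::string ClassifyTwoFinger(const Touch& a0, const Touch& b0,
                              const Touch& a1, const Touch& b1,
                              float pinchThreshold = 10.0f) {
    float delta = Dist(a1, b1) - Dist(a0, b0);
    if (std::fabs(delta) > pinchThreshold)
        return delta > 0 ? "pinch_out" : "pinch_in";
    return "two_finger_scroll";
}
```

That's roughly what UIKit's gesture recognizers do for you behind the scenes, so I'd expect the engine to ship an equivalent rather than leaving each of us to tune thresholds by hand.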