Hey there community folks,
I have been working on this project for some time, and here is where I am stuck: I have added OnTouchEnd events to my Blueprint static mesh components (basically, a door-opening animation fires off via a Timeline node whenever the SMC is tapped). They work perfectly when I simulate touch with the mouse on Windows. However, as soon as I port the project to Android, the touch starts to misbehave. The touch does get registered, which I can confirm because I have cross-checked it with a Print String node, but I still have to tap multiple times for the event to actually execute, whereas on Windows with mouse-as-touch it behaves properly. What am I doing wrong here, and how do I solve this?
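For clarity, the Blueprint logic is roughly equivalent to this C++ sketch (the class, component, and timeline names here are just placeholders, not my actual assets):

```cpp
// Rough C++ equivalent of the Blueprint setup (all names are placeholders).
// The static mesh component's OnInputTouchEnd delegate plays a door-opening timeline.
void ADoorActor::BeginPlay()
{
    Super::BeginPlay();

    // Component touch delegates only fire if the player controller has
    // bEnableTouchEvents ("Enable Touch Events") turned on.
    DoorMesh->OnInputTouchEnd.AddDynamic(this, &ADoorActor::HandleTouchEnd);
}

// Declared as a UFUNCTION() in the header, since OnInputTouchEnd is a dynamic delegate.
void ADoorActor::HandleTouchEnd(ETouchIndex::Type FingerIndex, UPrimitiveComponent* TouchedComponent)
{
    // Same as the Timeline node in the Blueprint: play the door-open animation.
    // (DoorOpenTimeline is an FTimeline member; its ticking is omitted here.)
    DoorOpenTimeline.PlayFromStart();
}
```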
Hey there @Gudakesh! I’m thinking this is one of the quirks of the touch system. Inputs are actually defined by which touch index they are: if your primary “Use” key in the input settings is Touch 1, only the first finger making contact will register it. So if, say, your left thumb is already resting on the movement stick, that thumb is actually Touch 1.
However, if you’re like me, you probably want a bit more control over how you parse the input instead of having it hard-assigned to Touch 1 or 2 or whatever. Introducing the Touch node! It’s a generic input node that you’d avoid if this were keyboard input, but since this is mobile, the drawbacks disappear. If your problems still persist after swapping to the generic Touch event (it shouldn’t fire when you touch a standard mobile control), let me know and we can do some diagnostics!
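If you ever peek at the C++ side, the same idea is a minimal BindTouch sketch like the one below (AMyPawn and OnAnyTouchReleased are placeholder names I’m assuming, not anything from your project); the handler is given the finger index, so nothing is pinned to Touch 1:

```cpp
// Minimal sketch of the generic touch binding in C++ (placeholder names).
void AMyPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Fires for every finger that is released, not just the one mapped as Touch 1.
    PlayerInputComponent->BindTouch(IE_Released, this, &AMyPawn::OnAnyTouchReleased);
}

void AMyPawn::OnAnyTouchReleased(ETouchIndex::Type FingerIndex, FVector Location)
{
    UE_LOG(LogTemp, Log, TEXT("Finger %d released at %s"), (int32)FingerIndex, *Location.ToString());
}
```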
Thank you for your immediate response; however, I think there may be some confusion here.
Above you can see how I am using the OnTouchEnd event to fire off the animation, so there is no primary key involved. Still, the same problem persists on mobile devices. If I touch the door, the door opens. But if I then orbit the camera using touch, I need to touch the door multiple times afterwards (sometimes twice, as if the first touch is only selecting the door and the second touch is what actually fires the animation, or rather the event).
Ahhhh, you’re using the direct touch methods. Could it be that the touches are registering as moved before the release? Are there any UI elements that could be consuming the input? Could you put some breakpoints on the branch and a watch on the touch-moved bool?
It’s possible the device is “jittering” a bit on touches, so some of them might register a move before the release.
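One quick way to check that, besides the Blueprint debugger, is to log how far each touch drifts after the press; if short taps are producing Moved events, it will show up in the device log. A sketch, assuming a pawn of your own (AMyPawn and the PressLocation member are placeholders):

```cpp
// Diagnostic sketch (placeholder names; PressLocation is an assumed FVector member).
// Logs how far each touch drifts after the press, so jitter shows up in the device log.
void AMyPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);
    PlayerInputComponent->BindTouch(IE_Pressed, this, &AMyPawn::TouchStarted);
    PlayerInputComponent->BindTouch(IE_Repeat,  this, &AMyPawn::TouchUpdated); // IE_Repeat = touch moved
}

void AMyPawn::TouchStarted(ETouchIndex::Type FingerIndex, FVector Location)
{
    PressLocation = Location;
}

void AMyPawn::TouchUpdated(ETouchIndex::Type FingerIndex, FVector Location)
{
    // If this fires during what you intended as a simple tap, the touch is jittering.
    const float Drift = FVector::Dist2D(Location, PressLocation);
    UE_LOG(LogTemp, Log, TEXT("Touch %d moved, drift = %.1f px"), (int32)FingerIndex, Drift);
}
```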
I was speculating the same thing. What are the possible ways to counter the minor jitter that comes from touching the device, without changing much in the Blueprint? I don’t want to redo the whole thing, as the delivery date is near and there are a lot of assets facing the same issue.
I think one possible way to do this is to preprocess the touch event: before setting the moved bool, delay it for 0.1 seconds, then branch or gate on whether the touch has been released or not. This way, short taps will have essentially zero chance of firing the move event. The downside is that if you have swipe gestures that rely on that moved bool, they would also be delayed.
There’s likely a less hacky way to pull this off, but this was my first and easiest thought.
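In C++ terms the idea would look something like this (just a sketch; bTouchMoved, MovedDebounceTimer, and the class name are placeholders standing in for whatever your Blueprint uses):

```cpp
// Sketch of the debounce idea (placeholder names; bTouchMoved is a bool member,
// MovedDebounceTimer an FTimerHandle member). Instead of setting the moved bool
// immediately, wait 0.1 s and only set it if the finger is still down.
void AMyPawn::OnTouchMoved(ETouchIndex::Type FingerIndex, FVector Location)
{
    if (bTouchMoved || GetWorldTimerManager().IsTimerActive(MovedDebounceTimer))
    {
        return; // already flagged as a drag, or a check is already pending
    }

    GetWorldTimerManager().SetTimer(MovedDebounceTimer, [this, FingerIndex]()
    {
        float X, Y;
        bool bStillPressed = false;
        if (APlayerController* PC = Cast<APlayerController>(GetController()))
        {
            PC->GetInputTouchState(FingerIndex, X, Y, bStillPressed);
        }
        // Only treat it as a real drag if the finger is still down after the delay.
        bTouchMoved = bStillPressed;
    }, 0.1f, false);
}

void AMyPawn::OnTouchReleased(ETouchIndex::Type FingerIndex, FVector Location)
{
    // Released before the 0.1 s elapsed: it was a clean tap, so never mark it as moved.
    GetWorldTimerManager().ClearTimer(MovedDebounceTimer);
    bTouchMoved = false;
}
```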