Article written by Ryan B.
At the OS level, iOS handles touch input and enforces a small deadzone. Normally this is fine, but it can filter out very small touch movements.
The fix we found for this issue is to drive most of our input through widgets. To handle our touch input, we just use the touch moved events that come after `OnTouchFirstMove`; by that point the OS is no longer filtering the input, which allows for more precise control.
Glad to see this topic; I just ran into this problem.
I’m working on an FPS game right now. On iOS devices, because of the deadzone, when I move my finger slightly there is no rotation at first, and then there is a sudden jump. I noticed that after the BeginTouch event is received, the FirstTouchMoved event is sent only after the touch has moved a certain number of pixels, followed by TouchMoved events.
You said that to handle your touch input you just use the touch moved events, which happen after `OnTouchFirstMove`. Does engine 4.24 already handle this, or do I need to handle it myself in the gameplay logic, using the touch location from the FirstTouchMoved event as the begin touch location? And if I do that, the movement between pressing the screen and receiving the FirstTouchMoved event can’t actually be fed back, right?
Or, on iOS devices, because of the deadzone on the first input, is it simply impossible to respond immediately with actual camera rotation while touching? Do you have a good solution to this problem?
Looking forward to your reply.