Hi, I have a problem in Unreal (5.2) that is driving me crazy. As shown in the following images, the pinch gesture causes a bug in the finger detection. I'm not completely sure what's going on, but it happens when the two fingers are close to each other and Unreal only detects one finger. It doesn't happen every time; you have to try several times to reproduce it.
The most worrying thing for me is that if I stop the simulation in the viewport and restart it, the bug persists; I have to completely restart Unreal for the problem to disappear.
I have encountered the problem both in the viewport and in Standalone Game mode, and "Use Mouse for Touch" is unchecked.
Does anyone have any idea what’s going on? Is this a bug in Unreal?
Some additional information after doing some tests:
I was able to reproduce the problem in an empty project. Pinching does not seem to be the cause: I could also reproduce it by tapping randomly and quickly with several fingers on the screen. It appears that after a while, two fingers placed on the screen are no longer identified as Touch 1 and Touch 2 but as Touch 3, 4, 5 and so on. I even got as far as Touch 10, after which Unreal no longer received any touch input at all.
Another piece of information: while my final project will use a dedicated touch screen on a Windows PC, for my tests I'm using an Android tablet connected to the PC via the SpaceDesk software. I don't know whether this setup is related to the problem; apart from this issue, everything seems to work fine.
It's obviously very problematic, because as soon as Touch 1 and Touch 2 are no longer identified correctly, the pawn becomes unusable. The only workaround I can think of right now would be to reset or remap the finger indices when the touches are released, but I have no idea how to do that (see the rough sketch below).
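To make the idea concrete, here is a rough, untested C++ sketch of the kind of remapping I have in mind (the class name ATouchRemapPawn and the two-slot limit are just placeholders, and I don't know if this is the right approach): instead of trusting the engine's finger index, each new press is assigned to the first free local slot (0 or 1), and the slot is freed again on release.

```cpp
// TouchRemapPawn.h -- untested sketch, placeholder names.
// Idea: ignore the raw ETouchIndex value and map each active finger
// to the first free local slot (0 = first finger, 1 = second finger).

#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "TouchRemapPawn.generated.h"

UCLASS()
class ATouchRemapPawn : public APawn
{
    GENERATED_BODY()

public:
    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);
        PlayerInputComponent->BindTouch(IE_Pressed,  this, &ATouchRemapPawn::OnTouchPressed);
        PlayerInputComponent->BindTouch(IE_Repeat,   this, &ATouchRemapPawn::OnTouchMoved);
        PlayerInputComponent->BindTouch(IE_Released, this, &ATouchRemapPawn::OnTouchReleased);
    }

private:
    // Maps the engine's finger index to a local slot.
    TMap<ETouchIndex::Type, int32> SlotByFinger;

    int32 FindFreeSlot() const
    {
        // Return the lowest local slot (0 or 1) not currently assigned to a finger.
        for (int32 Slot = 0; Slot < 2; ++Slot)
        {
            bool bUsed = false;
            for (const TPair<ETouchIndex::Type, int32>& Pair : SlotByFinger)
            {
                if (Pair.Value == Slot) { bUsed = true; break; }
            }
            if (!bUsed) { return Slot; }
        }
        return INDEX_NONE; // More than two fingers: ignore.
    }

    void OnTouchPressed(ETouchIndex::Type FingerIndex, FVector Location)
    {
        const int32 Slot = FindFreeSlot();
        if (Slot != INDEX_NONE)
        {
            SlotByFinger.Add(FingerIndex, Slot);
            UE_LOG(LogTemp, Log, TEXT("Finger %d pressed -> local slot %d"), (int32)FingerIndex, Slot);
        }
    }

    void OnTouchMoved(ETouchIndex::Type FingerIndex, FVector Location)
    {
        if (const int32* Slot = SlotByFinger.Find(FingerIndex))
        {
            // Drive the pinch/drag logic from *Slot instead of the raw FingerIndex.
            UE_LOG(LogTemp, Verbose, TEXT("Local slot %d moved to %s"), *Slot, *Location.ToString());
        }
    }

    void OnTouchReleased(ETouchIndex::Type FingerIndex, FVector Location)
    {
        SlotByFinger.Remove(FingerIndex); // Free the slot so the next press can reuse it.
    }
};
```

I'm not sure this would even help, though, since the underlying indices seem to drift at the platform/input level rather than in my own logic.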
I still haven't found a solution. I've tested on another PC, but the result is the same. At the very least, I'd like to know whether this problem is due to using an Android tablet via SpaceDesk, or whether it's somehow linked to Windows 11 touch handling.
My test project is very simple: it contains a Pawn with a camera and a "Touch" node whose Finger Index output is connected to a "Print String", driven by the Moved execution pin. I also have a GameMode that uses this Pawn as the default pawn. In the project settings, "Use Mouse for Touch" and mouse capture are disabled.
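In case it helps anyone reproduce this without a Blueprint project, here is roughly the same test pawn in C++ (again an untested sketch, ATouchPrintPawn is just a placeholder name); it only prints the finger index on every Moved event, like the Print String in my Blueprint:

```cpp
// TouchPrintPawn.h -- untested sketch, placeholder name.
// C++ equivalent of the Blueprint test pawn described above.

#pragma once

#include "CoreMinimal.h"
#include "Engine/Engine.h"
#include "GameFramework/Pawn.h"
#include "TouchPrintPawn.generated.h"

UCLASS()
class ATouchPrintPawn : public APawn
{
    GENERATED_BODY()

public:
    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);
        // IE_Repeat corresponds to the Blueprint Touch node's "Moved" pin.
        PlayerInputComponent->BindTouch(IE_Repeat, this, &ATouchPrintPawn::OnTouchMoved);
    }

private:
    void OnTouchMoved(ETouchIndex::Type FingerIndex, FVector Location)
    {
        // Same role as the Print String: show which finger index the engine reports.
        if (GEngine)
        {
            GEngine->AddOnScreenDebugMessage(-1, 1.f, FColor::Green,
                FString::Printf(TEXT("Touch %d at %s"), (int32)FingerIndex + 1, *Location.ToString()));
        }
    }
};
```

(The +1 is only there so the on-screen text matches the "Touch 1" / "Touch 2" naming, since ETouchIndex::Touch1 is value 0.)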
To reproduce it, you have to quickly tap your fingers on the screen one after the other and keep repeating the motion, like the drumming an impatient person would do on a table, if you know what I mean.
If someone with a touch screen could run this test, it would at least tell me whether the problem is with my setup or not.