Hey there,
Actually, what I meant is that right now it doesn’t seem possible to use the touch events and nodes in Blueprint to receive touch input on the Windows platform (unless I’m doing something very wrong).
What I actually tried was to get Tappy Chicken to “jump”, not when the touch at finger index 0 is detected (the first touch on the screen), but when finger index 1 is detected (i.e. the input event fires when a second finger touches the screen while the first finger is still down).
On Windows, the finger at index 0 is only recognized on a touch screen if the “Use Mouse for Touch” property is enabled in the Project Settings.
So with “Use Mouse for Touch” disabled, touching the screen with one finger in Tappy Chicken, or in my own tests, returns no touch at all, hence the incompatibility with multitouch (or any touch, really) that I was talking about.
With “Use Mouse for Touch” enabled, the finger at index 0 works when you touch the screen, but only because the engine thinks you just pressed a mouse button.
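For reference, this is roughly the C++ equivalent of the Blueprint InputTouch setup I was testing (the class and handler names like AMultiTouchPawn / OnTouchPressed are placeholders of my own, and the header declarations are omitted):

```cpp
// Rough C++ equivalent of the Blueprint "InputTouch" event node I was testing.
// AMultiTouchPawn and OnTouchPressed are placeholder names, not engine code.
void AMultiTouchPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Fires once per finger press, passing the finger index of that touch.
    PlayerInputComponent->BindTouch(IE_Pressed, this, &AMultiTouchPawn::OnTouchPressed);
}

void AMultiTouchPawn::OnTouchPressed(ETouchIndex::Type FingerIndex, FVector Location)
{
    // Only "jump" when the second finger (index 1) lands while the first is still down.
    if (FingerIndex == ETouchIndex::Touch2)
    {
        UE_LOG(LogTemp, Log, TEXT("Second finger down at %s"), *Location.ToString());
        // ...trigger the flap here...
    }
}
```

As described above, on Windows today this handler only ever fires for Touch1, and only with “Use Mouse for Touch” enabled, so the Touch2 branch is never reached.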
The problem is that our company does a lot of work on kiosk-mode PCs for high-end architectural visualisation, and we need multitouch for gestures like pinch-to-zoom, or for implementing a pan by dragging two fingers across the screen, etc.
For this kind of implementation we would need hardware touch support on Windows (like on the iOS platform), so that all touches are recognized as distinct and we can capture information like the positions of the first and second fingers, their delta movement, and so on, as in the sketch below.
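To give an idea of what we are after, here is a minimal pinch-distance sketch polled from a pawn’s Tick (PreviousPinchDistance is just a float member I added for the example); it only works if the platform reports both fingers as distinct touches:

```cpp
// Minimal pinch-to-zoom sketch: only meaningful if both fingers are reported
// as distinct touches. PreviousPinchDistance is a float member added for this example.
void AMultiTouchPawn::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    APlayerController* PC = Cast<APlayerController>(GetController());
    if (PC == nullptr)
    {
        return;
    }

    float X1, Y1, X2, Y2;
    bool bFinger1Down = false;
    bool bFinger2Down = false;
    PC->GetInputTouchState(ETouchIndex::Touch1, X1, Y1, bFinger1Down);
    PC->GetInputTouchState(ETouchIndex::Touch2, X2, Y2, bFinger2Down);

    if (bFinger1Down && bFinger2Down)
    {
        const float PinchDistance = FVector2D::Distance(FVector2D(X1, Y1), FVector2D(X2, Y2));
        if (PreviousPinchDistance > 0.f)
        {
            // Positive when the fingers move apart, negative when they close in.
            const float PinchDelta = PinchDistance - PreviousPinchDistance;
            // ...feed PinchDelta into the camera zoom, or the per-finger deltas into a pan...
        }
        PreviousPinchDistance = PinchDistance;
    }
    else
    {
        PreviousPinchDistance = 0.f; // reset as soon as either finger lifts
    }
}
```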
Anyway, for now this is keeping us from using UE4 on our day-to-day projects, and it’s a shame because the whole team got really excited about how easy and powerful it has become.
Thanks for your hard work, and I will keep monitoring you guys!
Still, if you could have a look at this issue we would be really grateful.
-GRAPHI