For some reason, on Windows devices, when a button is touched it fires two click events instead of one. It also calls SButton::OnMouseButtonDown and SButton::OnMouseButtonUp twice.
It can be reproduced both on a real Windows 10 device and in the Windows Simulator. I used a blank project and a simple user widget with a UButton.
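For reference, a rough C++ equivalent of the test widget (the original repro used a Blueprint UserWidget with a single Button; the class and widget names below are invented for this sketch) would be:

```cpp
// TouchReproWidget.h - sketch of the repro widget (names invented for illustration).
// A UserWidget with one Button; its OnClicked handler logs a message.
// On affected versions a single touch logs the message twice.
#pragma once

#include "CoreMinimal.h"
#include "Blueprint/UserWidget.h"
#include "Components/Button.h"
#include "TouchReproWidget.generated.h"

UCLASS()
class UTouchReproWidget : public UUserWidget
{
    GENERATED_BODY()

protected:
    // Expects a Button named "TestButton" in the widget hierarchy.
    UPROPERTY(meta = (BindWidget))
    UButton* TestButton;

    virtual void NativeConstruct() override
    {
        Super::NativeConstruct();
        if (TestButton)
        {
            TestButton->OnClicked.AddDynamic(this, &UTouchReproWidget::HandleClicked);
        }
    }

    UFUNCTION()
    void HandleClicked()
    {
        // Should fire once per touch; on affected versions it fires twice.
        UE_LOG(LogTemp, Warning, TEXT("Button clicked"));
    }
};
```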
After doing some digging I found a known issue with what I believe to be the same root cause. I have provided a link below to the public tracker. Please feel free to use the link provided for future updates.
After checking once again, I discovered that the bug can't be reproduced on 4.8, nor on 4.9 or any later version up to and including 4.13.
Windows multitouch support was added in 4.14, and in my opinion that is what broke touch handling.
As for your questions:
1. Yes, it can be reproduced in a clean project. As I said before, you don't even have to launch the game to reproduce the bug.
2. Launch the Windows Simulator, or Windows 10 on a touch screen, open the 4.15 editor, and touch any dropdown button: it appears to do nothing because the click happens twice, so the dropdown opens and then immediately closes. Alternatively, you can create a user widget with a button, add a Print String node to its OnClicked event, and add the widget to the viewport (exact Level Blueprint used: [screenshot]). When you run the game and touch the button, it prints the message twice.
3. I don't think any particular widget is to blame; I think it has something to do with the way Windows fires its input events. Every time the user touches the screen, it fires these events in this order:
Touch started
Touch ended
Left mouse button down
Left mouse button up
Before 4.14 Windows touch events weren't handled, but now they are. A simple workaround I found (I don't know how well it works yet) is to not handle mouse events while a touch hasn't ended yet (I had to change the engine source code for that); a rough sketch of the idea is below.
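The sketch below is not the actual engine change; the filter struct and its members are invented for illustration. Only GetMessageExtraInfo() and its documented touch signature are real Win32 API: Windows marks mouse messages that it synthesizes from touch/pen with the 0xFF515700 signature, so the duplicated left-button pair that arrives after "Touch ended" can be detected and dropped before it ever reaches Slate.

```cpp
// Sketch only: drop the left mouse button down/up that Windows synthesizes
// after a touch sequence, so the button is only "clicked" once.
#include <windows.h>

// True when the current mouse message was synthesized by Windows from touch input
// (documented Win32 behaviour: the message extra info carries the 0xFF515700 signature).
static bool IsMouseMessageFromTouch()
{
    const LPARAM ExtraInfo = GetMessageExtraInfo();
    return (ExtraInfo & 0xFFFFFF00) == 0xFF515700;
}

// Minimal stand-in for the state one would keep around the message handlers.
// In the engine this logic would live in or near FSlateApplication / FWindowsApplication.
struct FTouchMouseFilter
{
    bool bTouchInProgress = false;

    void OnTouchStarted() { bTouchInProgress = true; }
    void OnTouchEnded()   { bTouchInProgress = false; }

    // Returns true if a WM_LBUTTONDOWN / WM_LBUTTONUP should be forwarded.
    bool ShouldForwardMouseButtonEvent() const
    {
        // Swallow mouse button events while a touch is active, and swallow the
        // synthesized pair that Windows delivers right after the touch ends.
        return !bTouchInProgress && !IsMouseMessageFromTouch();
    }
};
```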
Thank you for the additional information. You were correct; the first issue that I linked was not the right one. After doing some digging, I was able to find a report that matches the exact issue you have reported here. I have provided a link below.
You have to dive a bit deeper and intercept it in FSlateApplication, and for that you need to build the engine from source. I'll post my workaround later.
Oh no… I just wanted to convert my project to 4.17 to get properly working touch functions. Seems I will have to use my delay hacks again.
After running a few tests on our end, I found that this issue no longer occurs unless "Use Mouse for Touch" is enabled in the project settings. I would suggest ensuring that this setting is disabled. I hope that this information helps.
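For anyone checking this from code, the project setting is backed by UInputSettings::bUseMouseForTouch; a quick runtime check (the function name and log category below are just for illustration) could look like this:

```cpp
// Log whether the "Use Mouse for Touch" project setting is enabled.
#include "CoreMinimal.h"
#include "GameFramework/InputSettings.h"

void LogUseMouseForTouchSetting()
{
    const UInputSettings* Settings = GetDefault<UInputSettings>();
    UE_LOG(LogTemp, Log, TEXT("Use Mouse for Touch is %s"),
        Settings->bUseMouseForTouch ? TEXT("enabled") : TEXT("disabled"));
}
```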
I think this still deserves to be fixed when "Use Mouse for Touch" is enabled. We rely on that feature to easily support both mouse and touch interaction in the same application.
For now, it seems our alternative is to build an abstraction and process touch and mouse on separate code paths at the application level.
The API between the two is inconsistent (for instance, there is an Input Touch event for "Moved" but no mouse "Move" event, and mouse events are per button while touch events give you a finger index)… nothing impossible to work around, but unfortunately more work for the developer.
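To make that concrete, a rough shape of such an abstraction (all names below are invented) would normalize both sources into a single pointer-event stream, so the rest of the application sees a pointer index and a phase regardless of whether the input came from a finger or a mouse button:

```cpp
// Sketch of a unified pointer abstraction over mouse and touch (names invented).
#include "CoreMinimal.h"

enum class EUnifiedPointerPhase : uint8
{
    Began,  // touch started / mouse button down
    Moved,  // touch moved / mouse moved while a button is held
    Ended   // touch ended / mouse button up
};

struct FUnifiedPointerEvent
{
    int32 PointerIndex = 0;                                     // finger index, or a reserved index for the mouse
    EUnifiedPointerPhase Phase = EUnifiedPointerPhase::Began;
    FVector2D ScreenPosition = FVector2D::ZeroVector;
    bool bFromTouch = false;                                    // lets consumers special-case real touch if needed
};

// Gameplay/UI code implements one handler instead of separate mouse and touch paths.
class IUnifiedPointerHandler
{
public:
    virtual ~IUnifiedPointerHandler() = default;
    virtual void HandlePointerEvent(const FUnifiedPointerEvent& Event) = 0;
};
```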