Build Type: Unreal Launcher and from Github
For some reason, on Windows devices, when a button is touched it fires two click events instead of one. It also calls
It can be reproduced both on a real device with Windows 10 and in the Windows Simulator. I used a blank project and a simple user widget with
You don’t even have to run the game, the same thing happens in Editor.
After doing some digging, I found a known issue with what I believe to be the same root cause. I have provided a link below to the public tracker. Please feel free to use the link for future updates.
Link: Unreal Engine Issues and Bug Tracker (UE-22105)
Make it a great day
I have pushed it to public. It should be available now.
The bug can be reproduced on 4.8, so I don’t think this bug and UE-22105 share the same root cause.
I have a few questions for you that will help narrow down the issue you are experiencing.
- Can you reproduce this issue in a clean project?
- If so, could you provide a detailed list of steps to reproduce this issue on our end?
- Could you provide screenshots of any settings/blueprints/widgets that may be involved with this issue?
After checking once again, I discovered that the bug can’t be reproduced on 4.8, but it also can’t be reproduced on 4.9 or any later version up to 4.13.
Windows Multitouch was added in 4.14 and, in my opinion, it broke the touch event.
As for questions:
1. Yes, it can be reproduced in a clean project. As I said before, you don’t even have to launch the game to reproduce the bug.
2. Launch the Windows Simulator or Windows 10 with a touch screen, open the editor in version 4.15, and touch any dropdown button. It will do nothing, because the click happens twice: it opens and then immediately closes the dropdown. Alternatively, you can create a user widget with a button, add a Print String to its OnClicked event, and add it to the viewport (exact level blueprint used: [screenshot omitted]). When you run the game and touch the button, it will print the message twice.
3. I don’t think any particular widget is to blame. I think it has something to do with the way Windows fires its input events: every time the user touches the screen, it fires these events in this order:
Left mouse button down
Left mouse button up
Before 4.14, Windows touch events weren’t handled, but now they are. A simple workaround I found (I don’t know how well it works yet) is to not handle mouse events while the touch hasn’t ended (I had to change the engine source code for that).
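The workaround above can be sketched as a small, engine-free state machine (a minimal illustration of the idea, not the actual FSlateApplication/FWindowsApplication change; the event-kind names are invented for this sketch): while a touch sequence is in progress, the synthesized left-mouse-button down/up pair that Windows delivers is dropped, so only the touch events get handled.

```cpp
#include <cassert>

// Hypothetical event kinds modeling what Windows delivers for a tap:
// a touch down/up, plus a synthesized left-mouse-button down/up pair.
enum class EventKind { TouchDown, TouchUp, MouseDown, MouseUp };

// Drops mouse events while a touch sequence is in progress, mimicking
// the "don't handle mouse events until the touch has ended" workaround.
class TouchAwareInputFilter
{
public:
    // Returns true if the event should be handled, false if suppressed.
    bool ShouldHandle(EventKind Event)
    {
        switch (Event)
        {
        case EventKind::TouchDown:
            bTouchActive = true;
            return true;
        case EventKind::TouchUp:
            bTouchActive = false;
            return true;
        case EventKind::MouseDown:
        case EventKind::MouseUp:
            // Suppress the synthesized mouse pair while the finger is down.
            return !bTouchActive;
        }
        return true;
    }

private:
    bool bTouchActive = false;
};
```

With this filter, a tap sequence (touch down, mouse down, mouse up, touch up) yields exactly one logical press from the touch path, while an ordinary mouse click with no touch active passes through untouched.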
Thank you for the additional information. You were correct; the first issue that I linked was not the right one. After doing some digging, I was able to find a report that matches the exact issue you have reported here. I have provided a link below.
Link: Unreal Engine Issues and Bug Tracker (UE-43213)
Make it a great day
Hey! I’m having the same problem. At what point do you intercept the input call and block it? In PlayerController.cpp?
You have to dive a bit deeper and intercept it in FSlateApplication, and for that you need to build the engine from source code. I’ll post my workaround later.
Is there a link to your workaround?
I made a pull request with the fix, and it goes one layer deeper, straight to FWindowsApplication.
If you set ShowMouseCursor to false in the Player Controller class defaults, the event should fire only once.
I am still getting the same buggy behaviour on a Windows 10 touch machine using 4.17.1.
(just confirming - seeing this problem still as well)
Oh ■■■■… I just wanted to convert my project to 4.17 to get properly working touch functions. Seems I will have to use my hacks with delays again.
After running a few tests on our end, I found that this issue no longer occurs unless “Use Mouse for Touch” is enabled in the project settings. I would suggest ensuring that this setting is disabled. I hope that this information helps.
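For reference, “Use Mouse for Touch” can also be toggled outside the editor UI via the project’s input settings config. Assuming a standard project layout, disabling it in `Config/DefaultInput.ini` would look roughly like this (a config sketch, not a guaranteed fix for the cases discussed above):

```ini
; Config/DefaultInput.ini
[/Script/Engine.InputSettings]
bUseMouseForTouch=False
```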
Make it a great day
I think this would still deserve to be fixed when ‘Use Mouse for Touch’ is selected. We rely on that feature to easily support both mouse and touch interaction in the same application.
For now, it seems our alternative is to build an abstraction and process touch and mouse on separate code paths at the application level.
The API between the two is inconsistent (there is an Input Touch event for ‘Moved’ but no mouse ‘Move’ event, for instance; also, mouse events are per button, while touch events give you a finger index)… nothing impossible to work around, but unfortunately more work for the developer.
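The abstraction described above can be sketched as a thin normalization layer (illustrative only; the struct and function names are invented for this sketch and are not Unreal API): both input sources are mapped into one unified pointer-event shape, with the mouse folded into a reserved pointer index and touch finger indices offset so they never collide with it.

```cpp
#include <cassert>

// One unified pointer-event shape for both mouse and touch input.
enum class PointerAction { Down, Move, Up };

struct PointerEvent
{
    PointerAction Action;
    int PointerIndex; // 0 for the mouse, 1+ for touch fingers
    float X, Y;
};

// Mouse side: per-button down/up events map onto the reserved index 0.
PointerEvent FromMouseButton(bool bDown, float X, float Y)
{
    return { bDown ? PointerAction::Down : PointerAction::Up,
             /*PointerIndex=*/0, X, Y };
}

// Mouse movement, which has no direct touch counterpart, becomes a Move.
PointerEvent FromMouseMove(float X, float Y)
{
    return { PointerAction::Move, /*PointerIndex=*/0, X, Y };
}

// Touch side: per-finger events pass their action through; finger
// indices are offset by one so the mouse pointer never collides.
PointerEvent FromTouch(PointerAction Action, int FingerIndex, float X, float Y)
{
    return { Action, FingerIndex + 1, X, Y };
}
```

Downstream code then handles `PointerEvent` only, so the mouse/touch inconsistencies (per-button vs. per-finger, missing mouse ‘Move’) are confined to these small adapters.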