UMG Widgets preventing Touch Released events from firing

I’ve attached a video and logs that can help illustrate the issue.

Essentially, interactable widgets “eat” the Released events for touches. It doesn’t happen every time, but it happens frequently enough that tracking gestures is a challenge, and eventually we hit touch index 10 and can’t interact with touch at all.

We can work around this with buttons by setting Touch Method to anything but Down and Up, but we can’t find any reasonable workarounds for other widget types.

We know we can reimplement interactable widgets ourselves to work around this, but our widget tree is enormous and it would be an undertaking we’d like to avoid.

We will be deploying on a Linux AIO, so we need to get this sorted on Linux - we can’t deploy on Windows instead.

I’ve seen some related issues here:

[Content removed]

[Content removed]

[Content removed]

[Content removed]

[Content removed]

[Content removed]

We’ve been experimenting with our own IInputProcessor to intercept touch events so they don’t get eaten, but we’re not sure how to send a “click” event to the widgets that should receive one this way. If we return false to let UMG handle the event, we end up in the same position: the Touch Released event is eaten.
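For reference, here’s roughly the shape of what we’ve been trying (a simplified sketch; the class and member names are ours):

```cpp
#include "Framework/Application/IInputProcessor.h"
#include "Framework/Application/SlateApplication.h"

// Simplified sketch of our experimental preprocessor. Slate delivers touches
// to preprocessors as pointer events, so we watch the button down/up pair.
class FTouchGestureProcessor : public IInputProcessor
{
public:
	virtual void Tick(const float DeltaTime, FSlateApplication& SlateApp, TSharedRef<ICursor> Cursor) override {}

	virtual bool HandleMouseButtonDownEvent(FSlateApplication& SlateApp, const FPointerEvent& MouseEvent) override
	{
		if (MouseEvent.IsTouchEvent())
		{
			ActiveTouchIndices.Add(MouseEvent.GetPointerIndex());
		}
		return false; // let UMG handle it - which is where the Released event gets eaten
	}

	virtual bool HandleMouseButtonUpEvent(FSlateApplication& SlateApp, const FPointerEvent& MouseEvent) override
	{
		if (MouseEvent.IsTouchEvent())
		{
			ActiveTouchIndices.Remove(MouseEvent.GetPointerIndex());
		}
		return false;
	}

private:
	TSet<int32> ActiveTouchIndices;
};

// Registered at startup:
// FSlateApplication::Get().RegisterInputPreProcessor(MakeShared<FTouchGestureProcessor>());
```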

Another solution I’ve seen is to comment out a certain line in SlateApplication, but we’d prefer not to make engine edits.

Do you have any other thoughts on how we can work within the constraints of the launcher build of 5.4 to track touch gestures such as swipe and pinch while still allowing interactable widgets to respond to touch normally?

Steps to Reproduce

  1. Open the TouchOrbit project, then run in PIE or Standalone, or build a Shipping build.
  2. Touch and drag to make the camera orbit, and note the prints to screen: Touch 1 is being pressed and moved. Lift the finger and note a print for Released Index 1.
  3. Tap a few widgets on the screen; the combobox usually gives the most trouble.
  4. Try to touch and drag again outside of a widget. Note that the touch index has increased despite only one finger being down.

Hi Jared,

It sounds like this may not be helpful for your current project (unless you’re a ways out from shipping and will be taking engine upgrades), but we’re in the process of integrating SDL3 to resolve a number of Linux input issues and remove some hacks that we’ve added over the years. One of the main priorities of the upgrade is better handling for touch support (which has been spotty on Linux for some time), so that may be coming with 5.7 if we’re able to iron out all of the issues.

For a more short-term workaround, have you been able to determine the conditions where the release event isn’t sent? If you enable the Slate Debugger, do you see the release event at all? Or is it never sent by SDL? If it’s related to the gesture processing in some of the linked posts, perhaps your input preprocessor could detect situations where an event leaks (finger up is never called) and manually send it to counteract the rising touch ID count. That may be tricky if you support multitouch since you’d need to determine the difference between a leaked ID and a finger that is still being held down, but if you exceed a certain finger count then you may be able to flush all of the active IDs without being too disruptive. Ideally we’d be able to detect the SDL gesture events and handle them properly (or at least cancel any hanging TouchDown events) but I fear that won’t be possible without engine changes.
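To sketch the manual flush idea (untested, and the heuristic for deciding that an ID has leaked is the hard part): FSlateApplication implements FGenericApplicationMessageHandler, so calling OnTouchEnded on it should feed the same path the platform layer would have used.

```cpp
#include "Framework/Application/SlateApplication.h"

// Untested sketch: manually "release" a touch index that appears to have leaked.
// The last-known location would come from your preprocessor's own tracking;
// ControllerId 0 assumes a single local user.
static void FlushLeakedTouch(int32 TouchIndex, const FVector2D& LastKnownLocation)
{
	if (FSlateApplication::IsInitialized())
	{
		FSlateApplication::Get().OnTouchEnded(LastKnownLocation, TouchIndex, /*ControllerId*/ 0);
	}
}
```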

Best,

Cody

Hi [mention removed], thank you for getting back to me on this.

I’ve attached some Slate debugger logs, and Touch End is not being called the majority of the time - the events just get eaten. It only happens after I begin trying to interact with interactable widgets like buttons, comboboxes, sliders, etc. As I noted in the original question, I can get around this with buttons by setting Touch Method to anything but Down and Up, but I can’t find any reasonable workarounds for the other widget types.

How would I go about manually sending touch end events from my input processor? Where would I send such an event, and to which object(s)? How could I flush the held fingers? These workarounds might be as much as we need for now; I’m just not sure how to execute them.

Hi,

Grabbing main should be sufficient; we switched to using SDL3 by default, so there’s no setting you’d need to change. If I’m remembering correctly, our previous investigation pointed to SDL_gesture as a potential cause of the missing Release events (since it was doing some gesture processing and sending different events that we don’t handle). That gesture handling code is completely gone in SDL3, so I was hoping that would improve things, though it’s still going through QA and it would be unsurprising if there were some other bugs to stamp out.

There’s a bit of logging in FLinuxApplication; you can set LogLinuxWindow and LogLinuxWindowEvent to Verbose to see which events we’re receiving from SDL at various points. In SDL2, we were seeing event 2050 (0x802) when the release event was consumed, which corresponded to SDL_MULTIGESTURE. I see that event has indeed been completely removed in SDL3, so running with verbose logging might tell us whether something odd is going on or whether it’s a different issue entirely.
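For example, in DefaultEngine.ini (or `Log LogLinuxWindow Verbose` at the console):

```ini
[Core.Log]
LogLinuxWindow=Verbose
LogLinuxWindowEvent=Verbose
```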

Best,

Cody

[mention removed], I don’t mean to rush you, but do you have any thoughts on my recent post? Thank you!

CC’ing [mention removed] from our organization as well.

Hi,

So sorry for the delay. Trying to manually flush touch events may be more error-prone than it’s worth, but a similar issue came up in another thread [Content removed], and we were able to work around it by changing the default mouse cursor to None in the player controller. Does that work for you? It’s not ideal, since you’ll need to re-enable it when using the mouse, but that may be the quickest path forward until we’re able to roll out SDL3.
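In code that’s just the following (the class name is a stand-in for your own controller):

```cpp
// Hypothetical PlayerController subclass - hide the cursor by default.
AMyPlayerController::AMyPlayerController()
{
	DefaultMouseCursor = EMouseCursor::None;
}
```

You can flip CurrentMouseCursor back to EMouseCursor::Default at runtime when mouse input is needed.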

Best,

Cody

[mention removed] To follow up here, we see SDL3 in the public GitHub’s ue5-main branch. If we pull and build that, is it expected that SDL3 will be the backend used for touch on Linux out of the box, or is there a setting that must be set? Or is ue5-main not current enough for the fixes we would need?

I’ve built this branch locally to test my test project against it, but we’re coming up against the same issues.

Thanks Cody - what I’ve found is that the culprit seems to be mouse capture. On Linux, at least, when a widget captures the mouse, it prevents the FINGERUP events from firing for that finger index in FLinuxApplication.

We can take a scorched-earth approach here and remove mouse capture from any interactive widget events (sketched below), but we were wondering if you had a better idea for how to handle this - our first release is fast approaching, so if we can’t fix this at its core, we at least need a workaround. If this is a reasonable course of action, do you have any thoughts on how it may negatively affect our UI in general?
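For concreteness, by removing mouse capture I mean changing press handlers that currently return something like the first reply below to return the second (a hypothetical Slate handler; ours vary):

```cpp
// Typical Slate press handler that captures the mouse:
return FReply::Handled().CaptureMouse(SharedThis(this));

// "Scorched earth" variant - handle the press without capturing:
return FReply::Handled();
```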

Hi Jared,

A few avenues you can try here:

  1. We have a CVar, LinuxApplication.EnableSimulatedMouseFromTouch, that you can disable to prevent those mouse events, at which point you should *only* see the touch events in your log (see the config sketch after this list). I’ve tried toggling this in the past and still had issues, but it’s quick enough to test out and see if it improves things. We see similar issues with Windows touchscreens: they send both a native touch event and a simulated click (to make non-touch apps work properly), and that can cause all sorts of weird behaviors.
  2. There are two big sources of mouse capture that you’re probably seeing: the viewport and any button widgets. The viewport captures the cursor by default, but you can change that setting in Project Settings under “Default Viewport Mouse Capture Mode” (also in the config sketch below). Other things, like setting the input mode, may adjust that setting at runtime. Button widgets only capture the mouse if set to “Down and Up” for their click/touch method, which we do to ensure you don’t need to remain hovered over the button between the press and the release. You can avoid that by using one of the other modes (Down or Precise Click/Tap).
  3. If capture is the culprit here, then you should see the events come into LinuxApplication (with the logging mentioned before) and make their way through to FSlateApplication::ProcessTouchEndedEvent, at which point they’re handed off to the Mouse Up handler. If that’s not happening, perhaps something is wrong in FLinuxApplication::UpdateMouseCaptureWindow and we’re sending the event to the wrong window altogether. We should add some additional logging in there to get a better picture of what’s going on.
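For points 1 and 2, the config equivalents look something like this (assuming the standard ini locations):

```ini
; DefaultEngine.ini - point 1: disable simulated mouse events from touch
[SystemSettings]
LinuxApplication.EnableSimulatedMouseFromTouch=0

; DefaultInput.ini - point 2: stop the viewport from capturing the cursor
[/Script/Engine.InputSettings]
DefaultViewportMouseCaptureMode=NoCapture
```

For buttons, the same change can be made at runtime with MyButton->SetTouchMethod(EButtonTouchMethod::PreciseTap) if editing every asset is impractical.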

A bit of a theory: I wonder if the synthesized mouse up event is causing capture to be released via SDL_CaptureMouse(false), and that interferes with the Finger Up event since we’ve already released capture. It’s hard to be sure without hardware to test on, but that seems plausible (in which case the CVar in point 1 might help).

Thanks Cody.

  1. Thanks for calling out the simulated-mouse CVar. It was worth a try, but ultimately it did not solve our issue. It does at least solve the mystery of why we get mouse click events on touch.
  2. We’re seeing mouse capture from multiple widgets in our application, which is essentially just a full-screen UserWidget with some render targets: buttons, sliders, comboboxes, checkboxes, etc. Buttons and checkboxes are fairly easy to work around by, as you said, using the Down or Precise Tap touch methods, but nothing like that exists for the other widget types as far as I can tell. Comboboxes specifically seem to be the hardest to get around without editing the source. Really, though, we’d prefer not to do this unless we can automate it - our widget tree is multiple layers of nesting deep, so even going through and setting the touch method on buttons is a huge manual task (see the sketch after this list).
  3. What’s really strange is that we DON’T see the SDL_FINGERUP case being hit for the affected finger at all in FLinuxApplication. None of the logging events in the SDL_FINGERUP case in FLinuxApplication::ProcessDeferredMessage occur for the affected finger index after mouse capture. They do occur if mouse capture is not routed, and they occur for subsequent touches. That is to say, Index 0’s FINGERUP events never arrive after mouse capture, so subsequent touches assume Index 1, and so on up to Index 9. If mouse capture is routed during a touch event 10 times, Unreal just stops listening for touch events entirely. I know that limit is imposed by Unreal, not SDL, but the FINGERUP events are being blocked for some reason.
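The automation we have in mind looks roughly like this (untested, and it only covers UButton-derived widgets, which is exactly the limitation we’re worried about):

```cpp
#include "Blueprint/UserWidget.h"
#include "Blueprint/WidgetTree.h"
#include "Components/Button.h"

// Untested sketch: walk a UserWidget's tree and switch every button to
// PreciseTap so it no longer captures the mouse. Nested UserWidgets have
// their own WidgetTree, so recurse into them.
static void SetButtonsToPreciseTap(UUserWidget* Root)
{
	if (!Root || !Root->WidgetTree)
	{
		return;
	}

	Root->WidgetTree->ForEachWidget([](UWidget* Widget)
	{
		if (UButton* Button = Cast<UButton>(Widget))
		{
			Button->SetTouchMethod(EButtonTouchMethod::PreciseTap);
		}
		else if (UUserWidget* Child = Cast<UUserWidget>(Widget))
		{
			SetButtonsToPreciseTap(Child);
		}
	});
}
```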

Hi,

Yeah, your third point there is in line with what I’ve seen reported in the past. It tends to come up every few months, and we haven’t locked down a solid repro case or cause yet, so apologies if there’s a lot of speculation and not much in the way of a concrete solution.

Were you able to try setting the default mouse cursor to None in the player controller? It’s a bit unclear whether the missing FingerUp events are caused specifically by viewport capture or by any capture at all, though I suspect that workaround has primarily worked in cases where there wasn’t much (or any) UI present, so maybe that’s why it was sufficient. If any capture whatsoever is causing the event to leak, that’s a bit trickier to handle without engine changes. I see two points where we call into SDL to change the capture target: FLinuxApplication::UpdateMouseCaptureWindow and UngrabAllInputImpl. If you’re able to temporarily make engine changes, it would be interesting to comment out all of those SDL_CaptureMouse calls and see if that fixes things in your build. It would certainly mess some things up in the Linux Editor (if you’re using that), but it’s a good data point nonetheless.

Another potential factor here: we skip capturing the mouse if the cursor is hidden (but we still release it) in FLinuxApplication::UpdateMouseCaptureWindow:

```cpp
bool bShouldGrab = (IS_PROGRAM != 0 || WITH_ENGINE != 0 || GIsEditor) && !LinuxCursor->IsHidden();
```

This might explain why the workaround of setting the mouse cursor to None was effective, so I’d definitely be interested in hearing if that does anything on your end. If it’s still misbehaving but you’re able to temporarily make some engine changes, adding logging next to the SDL_ShowCursor calls could help us figure out what leads up to the failure state where an event isn’t sent.
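Something as lightweight as this next to each call would do (illustrative only - bHidingCursor is a stand-in for whatever state the call site actually has on hand):

```cpp
// Temporary engine-side logging near the SDL_ShowCursor calls.
UE_LOG(LogLinuxWindow, Verbose, TEXT("Cursor visibility change requested (hiding=%d)"), bHidingCursor ? 1 : 0);
```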

Best,

Cody