UMG Mouse input on 2 different Widgets

Hey, so I want to have Widget A and Widget B, both added to the viewport. The input mode is supposed to be Game and UI, so if no widget is clicked, the game receives the input. By changing the Z-Order, I can choose which of the two widgets receives the input when I click on one of them. But I want both to be able to receive input.

What is happening: Widget A is on the left side of my screen, Widget B on the right. The widget with the higher Z-Order receives input, while clicks go right through the buttons of the other widget.

Desired behavior: Widget A and B both can be clicked, and if the mouse clicks on neither of them, the game receives the input.

Any idea how I can fix/ achieve this?

You can control this behaviour by returning Handled / Unhandled replies when overriding mouse events:

  • Handled - the input is consumed and neither the widgets underneath nor the game will be notified
  • Unhandled - the input is not consumed and will tunnel to the elements underneath; if no other widget handles it, the input will eventually arrive at the Player Controller, a.k.a. the game (unless we’re in the UI-only input mode, of course)

In addition, I’m intercepting the clicks here, giving this widget a chance to act accordingly, depending on which button was clicked.

In this very case:

  • LMB will call the Clicked event and terminate (handled)
  • RMB will call the Clicked event and carry on, looking for another input recipient (unhandled)
  • other mouse buttons will not call the event and also carry on (unhandled); you could middle-mouse click on a Visible widget and still have the player controller rotate the camera, for example

One note: it gets somewhat tricky with Buttons, as they consume input very hungrily. Essentially, if a click lands on a Button, UMG will assume you really meant it and just eat it. There’s a feature called Precise Click, but I don’t think that’s what you’re after here. Or is it specifically for buttons?

After dealing with UMG for years now, I’m really against using Buttons for anything other than simple click & forget or debug. Totally biased.


Widget overrides sit here:


Thanks for your great answer! Overriding the mouse handler didn’t change things in my specific case, but I found the culprit now…

I had a Set Visibility node which set the widget to Visible. (Which apparently doesn’t consume the input for the game, but does make it invisible to any other widget underneath.) For the described desired result it has to be Not Hit-Testable.


Glad to hear that.

Desired behavior: Widget A and B both can be clicked

I was posting under the impression that the above was the requirement. Setting a widget to Not Hit-Testable will prevent it from being clickable.

There are 2 options here, Self Only and Self & All Children Not Hit-Testable. They behave differently.

Self only is the only option that makes sense for the described case, yes.

It was my bad to overlook the node; I think I purposely set it to Visible back in the day to avoid drag-and-drop conflicts.

Greetings. This is driving me crazy. I am returning Unhandled on the top widget and Handled on the bottom widget, but I can never get a click to tunnel/bubble down.
I have tried stacking them directly in the viewport, in one main widget, and also putting one of them in a holder widget with a component that covers the added widget…
I have no idea why, but I cannot in any way get the tunneling to work!
Please help, I don’t understand what I am doing wrong. :o

To clarify, I am trying to make this work in as simple a test setup as possible:
I have two widgets whose mouse-down overrides return Unhandled. Both are set to print a debug message in the override. In the level blueprint I add one to the viewport first and then the other, with Z-Order 1 to make sure it ends up on top.
When I click anywhere, the top one triggers its debug print, but when I click on the area where the bottom one is, only the top one prints. Both should print.
I have tried different game modes and different settings for Focusable and visibility in both widgets.
Actually, at one point, when I changed the top one to Not Hit-Testable (it has only a Canvas Panel), it worked! I got both debug prints when clicking in the area of the bottom widget. But repeating this doesn’t work. It seems to have been a one-time fluke.
Could there be a bug in the engine here?

In fact I was trying to reproduce what this video shows, just to get my head around it:

Unfortunately he doesn’t describe the crucial part of how he actually stacks the widgets…

Anyone else who can make this work in Unreal Engine 5.3.2?

Hmm… OK, now I got it working, but ONLY when clicking on both areas and with very specific settings for visibility etc. on all components. This seems a bit unstable for sure…