On a UMG Button, how to handle DragDrop functionality while retaining OnClicked functionality?

Using this syntax:
Widget > Child | Sibling

I have a widget class to act as a CustomButton that looks like this:

Overlay > Border | Button

I am using these inside another widget (I’ll call it Grid) that looks like this:

VerticalBox > ScaleBox > UniformGridPanel > CustomButton | CustomButton | CustomButton

In the CustomButton, I set the OnClicked, OnHovered, and OnUnhovered behavior and it worked great.

Then, I wanted to be able to add drag and drop functionality, so I defined OnMouseButtonDown and OnDragDetected in the CustomButton, and OnDrop in the Grid. (This functionality doesn’t actually visually/physically drag and drop any buttons around, it merely links them together in the underlying DataAsset.)

But it wasn’t working. I figured out that if I disabled the Button (in CustomButton), the drag operations worked, but obviously, the click and hover operations didn’t. Then I figured out that if I overrode OnPreviewMouseButtonDown instead of OnMouseButtonDown, the drag-and-drop operations worked, and the hover also worked, but OnClicked wasn’t firing.

So after reading about OnPreviewMouseButtonDown, I learned about input events and how they “bubble” up the widget chain or, if using the Preview version, “tunnel” down the chain. So, it seems either the CustomButton can consume the click and let drags happen, or the Button can consume the click and let OnClicked happen.

How can I get both to work?

I thought I could use OnPreviewMouseButtonDown and simulate my own OnClicked: add a boolean in CustomButton called bIsMouseButtonDown, set it in OnMouseButtonDown, unset it in MouseLeave and DragLeave, and check it in OnMouseButtonUp; if it was still true, call ClickEvent (a custom event that I’d essentially turn OnClicked into). But this wouldn’t work with OnDrop, because then OnMouseButtonUp would consume the mouse-up event and OnDrop wouldn’t get it.

I tried disconnecting the EventReply from the DetectDragOnPressed node and instead, hooking up an Unhandled node to the Return node, but that didn’t work.

^ This could be construed as simply trying to figure out how to handle one input with multiple widgets; i.e. let the input keep bubbling up the chain instead of being consumed.
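The tunnel-then-bubble routing described above can be modeled in a few lines of standalone C++ (a simplified stand-in, not the real Slate API): Preview handlers run root-to-leaf first, then regular handlers run leaf-to-root, and whichever handler consumes the event first starves everyone else.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// Simplified stand-in for Slate's event routing (not the real Unreal API).
// A handler returns true to "consume" the event, stopping further routing.
struct Widget {
    std::string Name;
    std::function<bool()> PreviewHandler; // tunneling phase (root -> leaf)
    std::function<bool()> Handler;        // bubbling phase (leaf -> root)
};

// Routes an event through a root->leaf chain: first tunnel, then bubble.
// Returns the name of the widget that consumed the event, or "" if none did.
std::string RouteEvent(const std::vector<Widget>& Chain) {
    // Tunnel: root to leaf, Preview handlers run first.
    for (const Widget& W : Chain)
        if (W.PreviewHandler && W.PreviewHandler())
            return W.Name;
    // Bubble: leaf back up to root.
    for (auto It = Chain.rbegin(); It != Chain.rend(); ++It)
        if (It->Handler && It->Handler())
            return It->Name;
    return "";
}

// Demo of the situation in the question: CustomButton is the ancestor,
// the inner Button is the leaf and consumes presses in the bubble phase.
std::string WhoConsumesPress(bool bCustomButtonUsesPreview) {
    std::function<bool()> Preview;
    if (bCustomButtonUsesPreview)
        Preview = [] { return true; }; // OnPreviewMouseButtonDown consumes it
    std::vector<Widget> Chain = {
        {"CustomButton", Preview, nullptr},
        {"Button", nullptr, [] { return true; }},
    };
    return RouteEvent(Chain);
}
```

This is why only one of the two behaviors worked at a time: with the Preview override, CustomButton consumes the press during the tunnel phase before the Button's bubble-phase handler ever sees it.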

I also tried, after reading that doing drag and drop on Buttons is not really viable, to turn the Button into a Border instead. But I couldn’t find a way to register OnClicked for a Border widget.

So, either an answer to that would solve the problem, or just in general, the title of this post.

Thank you.

So, it seems either the CustomButton can consume the click and let drags happen, or the Button can consume the click and let OnClicked happen.

I also tried, after reading that doing drag and drop on Buttons is not really viable

Not recommended, but it can be done. Have a look at Precise Click - it’s on the Button, in an advanced panel that’s collapsed by default:

311461-annotation-2020-08-26-212025.jpg

It may or may not solve your issue.

turn the Button into a Border instead. But I couldn’t find a way to register OnClicked for a Border widget

It’s not a bad approach, provided you’re willing to recreate the fancy states, sound handling, and whatever else the snazzy buttons have. Border events can be overridden like so:

311462-annotation-2020-08-26-212359.jpg

You want to capture the Border’s attention first when the mouse button goes down:

And then Up:

As a bonus you get to handle Double Click - something a button is not that good at…


Setting up a widget to detect drag-and-drop as well as click is a bit tricky. I had the darndest time getting it to work with a UMG Button, but got it to work very reliably by using a Border and binding to its OnMouseButtonDown/OnMouseButtonUp, then manually detecting in OnMouseButtonUp whether this was a click, or in Event Tick whether it was a drag gesture. The screenshots show how you can do it; your user widget needs Pressed and Dragging boolean variables.

“On Click” and “On Drag Start/Stop” are just custom event nodes on the main event graph here. You can hook up your game logic to them. Hope this is helpful to you; I found it to work very reliably for my UI for dragging actors into the world.
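The Pressed/Dragging logic described above can be sketched as a standalone state machine. This is plain C++ rather than the actual Blueprint, and the 5-pixel drag threshold is an arbitrary assumption:

```cpp
#include <cassert>
#include <cmath>

// Standalone model of the Pressed/Dragging approach (not the actual
// UMG/Blueprint code): press records the cursor position, Tick promotes
// a press to a drag once the cursor moves past a slop threshold, and a
// release that never became a drag counts as a click.
struct DragClickDetector {
    bool bPressed = false;
    bool bDragging = false;
    float PressX = 0.0f, PressY = 0.0f;
    float DragThreshold = 5.0f; // pixels; an arbitrary assumed value

    void OnMouseButtonDown(float X, float Y) {
        bPressed = true;
        bDragging = false;
        PressX = X;
        PressY = Y;
    }

    // Call every frame with the current cursor position.
    void Tick(float X, float Y) {
        if (bPressed && !bDragging) {
            const float Dist = std::hypot(X - PressX, Y - PressY);
            if (Dist > DragThreshold)
                bDragging = true; // "On Drag Start" would fire here
        }
    }

    // Returns true when the release should be treated as a click.
    bool OnMouseButtonUp() {
        const bool bWasClick = bPressed && !bDragging;
        // "On Drag Stop" would fire here if bDragging was set.
        bPressed = false;
        bDragging = false;
        return bWasClick;
    }
};
```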

Also, here’s a use scenario for a border drag:

Thank you both for your quick replies. I ended up using something very similar to Zhi Kang Shao’s solution, but I read both answers and incorporated input from both into mine. This is what worked for me.

The HUD/Grid widget, which is basically just a 12x12 UniformGridPanel of custom buttons:

311612-ue-forum-answer-grid.png

The UniformGridPanel had to be set to Visible in order for OnDrop to work. Also, the custom buttons are all Visible. Everything else is “Not Hit-Testable (Self Only)”.

The custom Button widget:

311613-ue-forum-answer-custom-button.png

Both the “Button” and “ButtonAppearance” widgets are actually Borders and not Buttons. “Button” is invisible and fills the entire available space so that it can accept mouse inputs. Also, note that it is listed as the last of the Overlay children so that it is “on top” of the ButtonAppearance and the inputs will all be received without getting blocked. Everything is set to Visible, though Button has its Render Opacity set to 0.0 to be invisible when rendered.

NOTE: I initially encountered a problem getting the buttons to render at all after I had changed them to both be Border widgets. It turns out that it was the Content Padding. If they are both set to 0.0 then nothing will render. One or the other needs a positive value in order for them to render properly. I used a value of 0.5 for the Button and kept the ButtonAppearance’s Content Padding at 0.0.

Inside the Custom Button widget, I needed a boolean variable “bIsPressed” to track that the left mouse button had been pushed down (but not yet released) on the button, and that the cursor had not since left the button’s area. It defaults to false, is set to true in OnMouseButtonDown, and is checked in OnMouseButtonUp; if it is still true there, a CustomOnClicked event is fired, which takes the place of the regular OnClicked event. It is set to false in OnMouseEnter, OnMouseLeave, OnDragEnter, and OnDragLeave. All four of those imply that, since the mouse button was first pushed down, the cursor has left the button’s area, and therefore the release should not count as a click.
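As a sketch, the bIsPressed state machine described above looks like this in plain C++ (the actual implementation is Blueprint; ClickCount here just stands in for firing the CustomOnClicked event):

```cpp
#include <cassert>

// Plain-C++ model of the bIsPressed click-detection logic described above.
struct CustomButtonState {
    bool bIsPressed = false;
    int ClickCount = 0; // stands in for firing CustomOnClicked

    void OnMouseButtonDown() { bIsPressed = true; }

    // Any of these implies the cursor left the button's area since the
    // press, so the eventual release must not count as a click.
    void OnMouseEnter() { bIsPressed = false; }
    void OnMouseLeave() { bIsPressed = false; }
    void OnDragEnter()  { bIsPressed = false; }
    void OnDragLeave()  { bIsPressed = false; }

    void OnMouseButtonUp() {
        if (bIsPressed)
            ++ClickCount; // CustomOnClicked would fire here
        bIsPressed = false;
    }
};
```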

However, I did not need a “bIsDragging” variable like Zhi Kang Shao did, for whatever reasons. All I did was create a new BP class from DragDropOperation, which contained two variable references to my custom button class as SourceNode and DestinationNode. I created an instance of this operation from OnDragDetected:
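The role of the operation can be sketched in plain C++ (the real thing subclasses UDragDropOperation in Blueprint; ButtonNode, BeginDrag, and HandleDrop are illustrative names, not Unreal API): OnDragDetected creates the operation with the source set, and the Grid’s OnDrop fills in the destination and links the two in the underlying data.

```cpp
#include <cassert>
#include <string>

// Stand-in for the custom button's underlying data.
struct ButtonNode {
    std::string Id;
    std::string LinkedTo; // stands in for the DataAsset link
};

// Stand-in for the custom UDragDropOperation subclass with its
// SourceNode and DestinationNode references.
struct NodeDragDropOperation {
    ButtonNode* SourceNode = nullptr;
    ButtonNode* DestinationNode = nullptr;
};

// What OnDragDetected would do: create the operation with the source set.
NodeDragDropOperation BeginDrag(ButtonNode& Source) {
    NodeDragDropOperation Op;
    Op.SourceNode = &Source;
    return Op;
}

// What the Grid's OnDrop would do: set the destination and link the data.
bool HandleDrop(NodeDragDropOperation& Op, ButtonNode& Target) {
    if (!Op.SourceNode || Op.SourceNode == &Target)
        return false; // ignore drops onto the source itself
    Op.DestinationNode = &Target;
    Op.SourceNode->LinkedTo = Target.Id;
    return true;
}
```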

NOTE: I had promoted the DraggedWidget (which doesn’t work yet, but… irrelevant) to a variable so that in OnDragCancelled/OnDrop I could destroy it. However, I was surprised to learn that you can’t destroy widgets, but only remove them from their parent widgets, and let the GC take care of it. So… I’m not sure if that is necessary; probably not.

OnDrop was the only mouse event that I didn’t handle inside the custom button, but rather in the HUD/Grid, where I just cast the operation input pin to my custom operation and did what I needed to with the data.

I could no longer use the OnHovered and OnUnhovered events for the button, since they don’t exist for a Border, but OnMouseEnter and OnMouseLeave do exist, and they served the exact same purpose.

Thank you again to both of you!


I managed to do it by writing some custom C++ that combines the behavior of UButton and SButton with UserWidget. I add this Touch Target Widget into the hierarchy of my widget in place of a Button Widget and it handles clicks, drags, etc…

If you use the precise tap mode of Button, and forward drag events to a UUserWidget that returns a drag drop operation, it seems to work really well.

This little code snippet is a modification of some code in OnMouseButtonDown, taken from SButton.cpp:

else if (InputClickMethod == EButtonClickMethod::PreciseClick)
{
	// Here I'm checking whether drag detection is wanted in my SWidget,
	// which copies a lot of SButton code
	if (InteractionState->bDetectsDrag)
	{
		// We need to detect drag for this tap or click if it's set to do so
		LastPointerDragDetection.Add(MouseEvent.GetPointerIndex(), EffectingButton);
		// EffectingButton can just be EKeys::LeftMouseButton, but I support arbitrary keys
		return FReply::Handled().DetectDrag(AsShared(), EffectingButton);
	}

	// Do not capture the pointer for precise taps or clicks
	return FReply::Handled();
}

Here I call from my Slate widget into a UMG widget that then forwards the event onto a UUserWidget of your choice to return a drag drop operation.

FReply SRDBaseInteractionTargetWidget::OnDragDetected(const FGeometry& MyGeometry, const FPointerEvent& MouseEvent)
{
	FKey DragKey = LastPointerDragDetection[MouseEvent.GetPointerIndex()];
	
	// This is a slight hack: Unreal's drag-drop behavior of using the current mouse position instead of the starting mouse position feels janky.
	// This forces the mouse event to think it's at the position where the drag began, so you get good-feeling drag-drop behavior out of the box for free,
	// at the expense of possibly breaking things, since info about the current cursor position is lost.
	FVector2D PressedPos = InteractionKeyStates[DragKey].PressedScreenSpacePosition;
	
	FPointerEvent ModifiedPointerEvent = FPointerEvent(MouseEvent, PressedPos,
		PressedPos + (MouseEvent.GetScreenSpacePosition() - MouseEvent.GetLastScreenSpacePosition()));
	
	FReply Reply = ExecuteOnDragDetected(MyGeometry, ModifiedPointerEvent);

	if (Reply.IsEventHandled())
	{
		ReleaseAll();
	}

	return Reply;
}