Getting Mouse Position During Drag and Drop UMG


I’m following up on this thread:

I’m trying to calculate the mouse position for a widget that is being dragged. Currently, I can get the widget’s position, but it’s at the anchor point (the top left) rather than at the mouse that is dragging the widget. I’m hoping to account for that offset.

Attached is the setup to get the widget position while it’s being dragged.


Do you need the position of where the drag started and finished or the in-between?

Just the finish. I really only need the drop point so I can do a trace to see if an actor is there. Setting the Vector 2D on tick is overkill; I’m really only using the last value it’s set to, if that makes any sense.

I think it makes sense; if I understood you right, you want to drag&drop a widget onto a 3d world actor. Kind of tricky.

Generally speaking, you cannot track the cursor position (in the traditional way) during a drag operation because the data is not being sent to the controller. You can only obtain the cursor position if you are dragging over another widget, in which case pointer events return a screen-space location - you’ll need to use that instead:


The bottom line is: as long as your drop operation ends over a widget, you can get the screen location easily.

You’ll have two options here:

  • the actors have a fully transparent widget attached (updating its screen location during drag&drop - so you’ll end up dropping a widget on a 3d object’s widget; you will not even need a trace here)
  • a fully transparent widget covering the entire screen (or at least the gameplay area), like a border with Tint alpha set to 0 (here you’ll need to trace)

You only need to worry about it while dragging so it can be added/removed (or collapsed) to/from the viewport when dragging starts/ends.

Not sure if any other method exists.

Hey thanks,

I finally got the On Drop operation to spit out the right Vector 2D. I used the fully transparent widget approach because I already had that set up. After Get Screen Space Position, I used Absolute to Viewport and pulled off Pixel Position. As for the offset of where in the widget you clicked to drag: the simple way I found to deal with that is the Pivot enum when you create the Drag Item. I just set it to Center Center, so everything is measured from the center of the widget. This is a good thread, as there is some bad information out there about these mouse events during drag and drop.
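For anyone curious what Absolute to Viewport is doing under the hood, here is a minimal standalone sketch of the conversion in plain C++. The function name, `windowOrigin`, and `dpiScale` are hypothetical stand-ins for what the real node derives from the viewport geometry; this is a model, not Unreal API.

```cpp
#include <utility>

struct Vec2 { float X; float Y; };

// Convert an absolute (desktop-space) coordinate, like the one returned by
// Get Screen Space Position on a pointer event, into viewport pixel space.
// windowOrigin and dpiScale stand in for the values the engine reads from
// the actual window/viewport - in windowed mode the origin offset matters,
// which is why skipping this step only "works" in fullscreen.
inline Vec2 AbsoluteToViewportPixels(Vec2 absolute, Vec2 windowOrigin, float dpiScale)
{
    // Shift into window-local space, then undo the DPI scaling so the
    // result is in pixels, ready to feed into a screen-to-world deproject.
    return { (absolute.X - windowOrigin.X) / dpiScale,
             (absolute.Y - windowOrigin.Y) / dpiScale };
}
```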


Thanks for posting this workaround! Previously, in 4.19, this was not required for drag and dropping out of a scrollbar button widget - the touch positions were still passed through to the viewport. I guess Epic ‘fixed’ this behavior, as it no longer worked in 4.20. Using your second workaround option, however, I was able to get drag and dropping from a scroll list item to the viewport working in 4.20 again :)

@aussieburger: A month or so ago I spoke privately with @michalss, who was looking for a way to drag and drop widgets from a panel into the 3D world, outside of the widget’s boundaries (which is pretty normal). I suggested the above method (which works fine, sure).

Now, his response was more interesting, though. We all know one can create a custom drag & drop operation; what I did not know back then was that this operation has its own overrides. I’ve yet to test it out thoroughly enough, but I believe this is actually the *correct* way of obtaining the cursor position during drag and drop, rather than the hacky workaround I found.

Consider looking into the above; it’s probably a more future-proof solution, since things may well change in 4.21+.

What do you mean exactly by its own overrides? Is there any documentation / thread on the custom drag and drop setup?

Thanks btw for giving an update ;)

Essentially, you can create your own Drag&Drop operation - I’ve seen this covered (and used it) before, including in Epic’s official guides. You can extend the class and add data to the operation. But what I mean is this:

This will, apparently, quite happily spit out the cursor position independently of the Player Controller (which is receiving squat), even when the cursor is not traversing over another widget. Admittedly, I’ve yet to take it for a spin myself. If this is common knowledge, then I’ve been living under the Unreal rock - I never saw it documented or mentioned before, and there are countless threads regarding this *issue* on AH.

Looking forward to driving both a widget and a world object in a single Drag&Drop operation, without additional widget helpers.

Many thanks for that! Works as expected and is much cleaner than the previous workaround. No idea why it needs to be so complicated & abstract to set up, though. Not sure why you can’t retrieve this information from the default Drag and Drop blueprint object - like some sort of Dragging event binding?

Anyway, great that it is working now in a relatively future-proof way :)


Super helpful, thanks! Didn’t think to do this when the drag ate all my mouse position info. Works like a charm. If, like me, the next person is simply looking for what 3D objects are under their mouse cursor while dragging:
1 - Override the Dragged event in your custom Drag&DropOp class (shown in Everynone’s example above).
2 - Pass the PointerEvent that the Dragged event provides off to your player controller, or whichever class is handy.
3 - Process the PointerEvent with GetScreenSpacePosition (that gives you X/Y).
4 - Deproject Screen to World (that gives you a world position/direction).
5 - Use a LineTrace to see what’s down there!
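To make the last step concrete, here is a standalone sketch of the math the trace is doing in the simplest case - intersecting the deprojected ray with a flat floor. In a real project you would call LineTraceByChannel, which handles arbitrary geometry; this toy version only assumes a floor plane at a given height.

```cpp
#include <optional>

struct Vec3 { float X, Y, Z; };

// Step 5 stand-in: intersect the deprojected ray (origin + direction) with
// a flat floor at Z = floorZ. Returns the hit point, or nothing if the ray
// points away from the floor.
inline std::optional<Vec3> TraceToFloor(Vec3 origin, Vec3 dir, float floorZ)
{
    if (dir.Z >= 0.f)                               // ray goes up or is parallel
        return std::nullopt;
    const float t = (floorZ - origin.Z) / dir.Z;    // distance along the ray
    return Vec3{ origin.X + t * dir.X,
                 origin.Y + t * dir.Y,
                 floorZ };
}
```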

Yup, I’ve been using it ever since.

3a. Use the AbsoluteToViewport node to convert the desktop coordinates to a pixel position on the screen before the deproject. Otherwise it works fine in fullscreen, but not so much in windowed mode.

Why is this still so cryptic? The link to the example is dead, and there is no override for the Dragged event in a UMG class?

Because it’s undocumented, arcane knowledge obtainable only by those who know the secret handshake.

The below assumes you are using a Game Mode and a Player Controller, and have created a custom Drag & Drop Operation.

  • in the Player Controller

A Custom Event converts screen space to world space via deprojection and sets the actor’s position:

  • in the Custom Drag & Drop Operation blueprint

Event Dragged spits out screen-space coordinates and feeds them to an Event Dispatcher; we also provide an actor reference (more on that below). We’re using an actor, not an object - the assumption is that you actually want an actor. This Actor Payload variable is flagged as Instance Editable and Exposed on Spawn.

You could use the standard Drag Op’s Payload Object, but you’d need to cast it to an actor later on. An actor is more convenient in this instance.

  • in the Widget being dragged:
    – when drag is detected, we spawn an actor
    – create the abovementioned custom Drag Op and feed the actor’s reference into Actor Payload (explained above)
    – up to you how to handle the Default Drag Visual part
    – we then bind the Drag OP’s Dispatcher call with the Player Controller’s Event (first pic)
    – return drag result

Should work in any resolution / DPI / aspect ratio. Fingers crossed.
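The wiring above can be modeled outside the engine as a callback from the drag operation to the controller. Everything here - the type names, the trivial screen-to-ground mapping standing in for deprojection - is a hypothetical stand-in for the Blueprint setup, just to show the data flow: Dragged fires, the dispatcher forwards the screen position plus the actor payload, and the controller moves the actor.

```cpp
#include <functional>

struct Vec2 { float X, Y; };
struct Vec3 { float X, Y, Z; };

// Stand-in for the actor spawned when drag is detected.
struct Actor { Vec3 Location{}; };

// Stand-in for the custom Drag & Drop Operation: its Dragged event forwards
// the pointer's screen position plus the actor payload through an event
// dispatcher (modeled here as a std::function binding).
struct DragOp
{
    Actor* ActorPayload = nullptr;                // Instance Editable / Exposed on Spawn
    std::function<void(Vec2, Actor&)> OnDragged;  // the Event Dispatcher

    void Dragged(Vec2 screenPos)                  // fired repeatedly while dragging
    {
        if (OnDragged && ActorPayload)
            OnDragged(screenPos, *ActorPayload);
    }
};

// Stand-in for the Player Controller's Custom Event: "deproject" the screen
// position and set the actor's position. The mapping here is deliberately
// trivial, not real camera math.
struct Controller
{
    void MoveActorToScreenPos(Vec2 screenPos, Actor& actor)
    {
        actor.Location = { screenPos.X, screenPos.Y, 0.f };
    }
};
```

In this model, the dragged widget would spawn the Actor, construct the DragOp with ActorPayload set, and bind OnDragged to the controller’s handler - mirroring the dispatcher binding in the first pic.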

There are other methods of handling similar behaviours. The approach would depend on what the end goal is and how it’s supposed to work. If in doubt, do provide the details.


Thank you so much

Hi, sorry for bringing up (again) an old topic. I managed to follow the instructions, but I have two issues:

Disable collision on the object you’re dragging and enable it only once dragging is finalised. Otherwise, the line trace will hit the object (here, a chair) that you are dragging. We want to trace against the floor only, for example. Unless you wish to stack a chair on top of another chair, ofc.

Alternatively, look into how custom collision channels work - here the trace can be set up in such a way that certain groups of objects can be easily ignored. This is by far the most efficient method and gives you a lot of control over how tracing behaves.

Perhaps you want to place lamp sconces in areas that make sense only - walls, rather than floors. Collision channels would allow you to ignore certain surfaces:

Placing wallpaper on a wall blocked by a wardrobe would be possible, because dragging wallpaper uses a channel that furniture ignores.
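The channel idea boils down to a bitmask test: each surface declares which channels it blocks, and a trace on a given channel skips anything that ignores it. A toy standalone version (the channel names and types here are made up for illustration, not Unreal’s actual response containers):

```cpp
#include <cstdint>
#include <optional>
#include <string>
#include <vector>

// Toy trace channels; in Unreal these are defined in the project settings.
enum Channel : std::uint8_t { Furniture = 1 << 0, Wallpaper = 1 << 1 };

struct Surface
{
    std::string  Name;
    std::uint8_t BlocksMask;  // channels this surface blocks
};

// Return the first surface (in near-to-far order) that blocks `channel`,
// skipping everything that ignores it - the gist of a channel-filtered trace.
inline std::optional<std::string> TraceByChannel(const std::vector<Surface>& hits,
                                                 std::uint8_t channel)
{
    for (const Surface& s : hits)
        if (s.BlocksMask & channel)
            return s.Name;
    return std::nullopt;
}
```

With a wardrobe in front of a wall, a Wallpaper-channel trace passes through the wardrobe (which only blocks Furniture) and hits the wall behind it.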

Once I do an action (like jumping or toggling between first and third person), I can’t drag the widget anymore.

Sounds like the widget has lost focus. Are we in Game and UI Input Mode? Make sure the root of the widget has the isFocusable flag enabled.

Thanks, and how do you do this?
I’ll look up on trace channels.

Static Mesh Components have granular collision settings (and presets) which can also be adjusted dynamically. Perhaps the chair should start with no collision:

And once it’s plopped down, it can be re-enabled.
