How to make an interactive desktop in-game using interactive widgets?

And this might be very noob of me to ask

I’ve yet to see it documented so no worries.

Would the setup for this system be that I set up the event dispatcher in a parent widget blueprint, then reparent any other widget to that parent widget? That would let me implement the system with different widgets and have the dispatcher trigger in the main widget blueprint (the Monitor Widget, for example, that holds all the other widgets)?

Precisely, nicely put!

And I achieve that by dispatching a User Widget reference, so the Event Dispatcher stays type-agnostic? (Adding the Grabbed Offset as dispatcher data, as you mentioned.)

I was not clear enough about this one, my bad. If there were just a few widgets, I'd probably be OK with some casting. You'd need to manually add a dispatcher to every widget type - painful, but doable.

But here’s the version with inheritance and reparenting.


This is my setup:

[screenshot: 341975-screenshot-3.png]

  • ParentCanvas is where the widgets sit (the monitor thingy)
  • wMyUserWidgetBase is the base class which can dispatch a reference to its own type (and, by extension, all inheriting children):

One important thing here is that wMyUserWidgetBase should not have anything in its widget hierarchy - I removed the default Canvas Panel.

  • uw1-3 are User Widgets, each reparented to wMyUserWidgetBase; each one is different
  • since they are reparented, they all have access to the parent's dispatcher
  • and the Event Dispatcher's call can be caught by the ParentCanvas