I’m working on a VR menu and I’ve run into a problem. I have multiple widgets, each with its own canvas panel, layered on top of each other. I’m trying to set up a WidgetInteraction to click on buttons, but it seems that the various canvas panels are blocking each other and I can’t click the buttons.
Also, you’re not referencing the comps; this may come back and bite you if you ever want to manipulate them somehow. Seeing how this is an important-looking menu (as opposed to a dismissable widget prompt), it might be worth setting those refs. Or just add the comps to the actor, which references them automatically.
tl;dr:
I have multiple widgets, each with its own canvas panel, layered on top of each other
Is there a reason for this? What’s shown could be accomplished with a single widget and no canvas. Even if you needed many lists, I’d employ a Widget Switcher instead of juggling multiple widget comps.
Since components are rendered on quads, they must and will block traces. Placing 2 components close enough for them to overlap is something to be actively avoided.
Assuming I understood what is needed, I’d try it like so:
show / hide / move the menu - it’s still part of the actor but it lives in world space which may make sense for VR - again, providing I got the intent right:
Firstly, thank you so much for your detailed response.
I’m currently working on a custom plugin that should be suitable for all future projects. Therefore, the UI needs to be very dynamic. I want to be able to easily construct the UI in the Details Panel with just a few clicks, without having to constantly open the Widget Editor and potentially adjust the Blueprint. That’s why I chose this approach, where I can select an array in the Details Panel and specify how many buttons I want. Then, in a loop, it creates new widgets for me, one below the other.
In standard mode, without VR, this worked perfectly.
I haven’t found a way to directly create new buttons within the same widget. Is there a method for that?
But using native buttons seems too close to shooting oneself in the foot - these should be user widgets instead. For example:
once clicked, how are you going to identify which button was clicked? Perhaps it’s not important, though.
or, if you decide to change how the buttons behave / look - are you going to inject styles or have a library of pre-made brushes?
or, if you want to have 4 different types of buttons, each with a fancy animation - how would that work?
I’m currently working on a custom plugin that should be suitable for all future projects. Therefore, the UI needs to be very dynamic.
The problem is that your current approach seems to be heading in the direction opposite of modular and scalable. You have 2 unrelated, unreferenced widget comps that do not even know about each other’s existence.
However, I may not be fully grasping what is needed, since we only see a glimpse of the setup and the plan, ofc.
Long story short - avoid layering widget components; to make it work, one would need to rewrite the way the interaction component traces.
The very first step towards modularity I’d take, would be to look into extending the component itself:
You will end up with a component (can still be created dynamically!) that can not only maintain the life of a widget, but can also host logic and variables, and drive the whole thing:
Now you can define some variables right in the actor this component was added to and have it fully dynamically create a bunch of menus from arrays / classes / data or even pull stuff from DTs.
Step 2 would be looking into Named Slots - but that depends on the scope.