I understand that this may be a simple question, but all the videos and resources I have found online talk about creating animations for elements in a Widget rather than moving them via code/Blueprints.
I currently have an image within a Widget that I want to move manually via a Blueprint function. I have promoted the image to a variable, but for the life of me I cannot find how to actually set its relative X and Y coordinates in relation to the Widget it sits in. Any help is greatly appreciated!
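For reference, the usual way to do this when the image is parented to a Canvas Panel is the "Slot as Canvas Slot" node followed by "Set Position". A minimal C++ sketch of the same idea is below; `UMyHudWidget` (a UUserWidget subclass) and its `MyImage` member are hypothetical stand-ins for the widget and the promoted image variable:

```cpp
// Hedged sketch (UMG / C++). Blueprint equivalent: "Slot as Canvas Slot" -> "Set Position".
// Assumes MyImage is a UImage parented to a Canvas Panel inside this widget.
#include "Components/Image.h"
#include "Components/CanvasPanelSlot.h"
#include "Blueprint/WidgetLayoutLibrary.h"

void UMyHudWidget::MoveImageTo(const FVector2D& NewPosition)
{
    if (UCanvasPanelSlot* CanvasSlot = UWidgetLayoutLibrary::SlotAsCanvasSlot(MyImage))
    {
        // Position is relative to the owning Canvas Panel (and the slot's anchors).
        CanvasSlot->SetPosition(NewPosition);
    }
}
```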
a Widget that I want to move manually via a Blueprint function
You did not specify what "manually" means; the example above uses the mouse, and for that to work the canvas panel's visibility needs to be set to Visible, since we need something to receive Mouse Move events.
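As a rough illustration of that mouse-driven approach, here is a hedged C++ sketch that drags the image to the cursor from an overridden OnMouseMove. It reuses the same hypothetical `UMyHudWidget` / `MyImage` names from above and assumes the hit-testable area (e.g. the root Canvas Panel) is set to Visible:

```cpp
// Hedged sketch: move the (hypothetical) MyImage to the cursor while the
// mouse moves over the widget.
#include "Components/CanvasPanelSlot.h"
#include "Blueprint/WidgetLayoutLibrary.h"

FReply UMyHudWidget::NativeOnMouseMove(const FGeometry& InGeometry, const FPointerEvent& InMouseEvent)
{
    // Convert the cursor position from screen space into this widget's local space.
    const FVector2D LocalPos = InGeometry.AbsoluteToLocal(InMouseEvent.GetScreenSpacePosition());

    if (UCanvasPanelSlot* CanvasSlot = UWidgetLayoutLibrary::SlotAsCanvasSlot(MyImage))
    {
        CanvasSlot->SetPosition(LocalPos);
    }

    return Super::NativeOnMouseMove(InGeometry, InMouseEvent);
}
```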
Hey, thanks for the swift response! I believe you are quite close to what I am looking for; however, I am struggling to properly capture mouse movement in my widget.
Basically, I have a Widget that is projected onto a 3D panel, and I want to "forward" all mouse events to it. I tried "Set Input Mode UI Only", which according to the official docs should disable all game controls and have the UI consume all input. However, the overridden OnMouseMove function is never invoked.
I also tried setting "Input Mode" to "On" on my parent BP and forwarding the mouse movement via a custom function, but that is never invoked either.
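For context, a minimal C++ sketch of that "Set Input Mode UI Only" attempt is shown below (`AMyPlayerController` is a hypothetical player controller). As the rest of the thread shows, this alone does not route hardware mouse events into a widget that is displayed through a world-space WidgetComponent:

```cpp
// Hedged sketch of "Set Input Mode UI Only" from a player controller.
#include "Blueprint/WidgetBlueprintLibrary.h"
#include "Blueprint/UserWidget.h"
#include "GameFramework/PlayerController.h"

void AMyPlayerController::FocusScreenWidget(UUserWidget* ScreenWidget)
{
    // Focus the widget and let the UI consume all input; don't lock the mouse.
    UWidgetBlueprintLibrary::SetInputMode_UIOnlyEx(this, ScreenWidget, EMouseLockMode::DoNotLock);
    bShowMouseCursor = true; // make the cursor visible while the UI has focus
}
```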
Creating actors seems much easier than coding a proper UI.
Perhaps I could create a "fake" UI that covers the full screen, captures the mouse movement, and then translates it to the smaller 3D screen I actually have in my game. I just wanted to know whether there is a better solution than this.
So a widget component in world space mode? Or are render target shenanigans involved? Could you clarify what we're working with here? Because if you're using a widget component, the projection is already handled automagically. As in:
This is an actor with a widget component in world space.
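A hedged C++ sketch of that kind of setup, in case it helps map the screenshot to code; `AScreenActor` and the draw size are made-up names/values, and the widget asset to display would normally be assigned in the editor:

```cpp
// Hedged sketch: an actor exposing a world-space widget component.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/WidgetComponent.h"
#include "ScreenActor.generated.h"

UCLASS()
class AScreenActor : public AActor
{
    GENERATED_BODY()

public:
    AScreenActor()
    {
        ScreenWidget = CreateDefaultSubobject<UWidgetComponent>(TEXT("ScreenWidget"));
        RootComponent = ScreenWidget;

        // World space: the widget is drawn on a quad in the level, and the
        // component handles projecting cursor hits onto the widget for you.
        ScreenWidget->SetWidgetSpace(EWidgetSpace::World);
        ScreenWidget->SetDrawSize(FVector2D(1024.f, 512.f));
        // The UMG widget class is typically assigned in the editor,
        // or via SetWidgetClass().
    }

    UPROPERTY(VisibleAnywhere)
    UWidgetComponent* ScreenWidget;
};
```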
Alright, figured things out. I really appreciate your help! For anyone else facing input registration issues, the problem was that my WidgetComponent did not have "Receive Hardware Input" (under the "Interaction" tab) set to true.
That fixed things for me, greatly appreciate it @Everynone!
There is also an alternative - the Widget Interaction Component:
Think of it as emulation: let's say you have an FPS game / VR title where you point with a gun / non-mouse device (centre of the screen, or a trace?) at a keypad and hit E to use it.
You may want that input to be interpreted as if it were the mouse, since widgets do not really understand much else. Call an agnostic custom event, as sketched below:
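A hedged C++ sketch of that emulation, assuming a character that already has a UWidgetInteractionComponent (here called `WidgetInteraction`) attached to its camera; `AMyCharacter` and `UseKeypad` are illustrative names, not from the thread:

```cpp
// Hedged sketch: emulate a left mouse click on whatever widget the
// interaction component is currently pointing at.
#include "Components/WidgetInteractionComponent.h"
#include "InputCoreTypes.h"

void AMyCharacter::UseKeypad() // bound to the E key, for example
{
    // With InteractionSource set to World (the default), the component traces
    // forward up to InteractionDistance and handles hover automatically;
    // pressing/releasing the pointer key is seen by the hit widget as a
    // normal left mouse button click.
    WidgetInteraction->PressPointerKey(EKeys::LeftMouseButton);
    WidgetInteraction->ReleasePointerKey(EKeys::LeftMouseButton);
}
```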