Has anyone implemented a head-look-as-mouse system that interacts with UMG 3D Widget Components, like Oculus Home uses (or the SteamVR controller pointer)? They basically show a circle pointer on the widget’s 2D surface when the head-look pointer is over the widget (some even fade out the pointer at the edges of the widget’s surface).
The old VR template had some look-based interaction, mostly rigged up in Blueprints before UMG existed.
That said, if you want to interact with UMG via look, one approach is to make an eye/camera line trace and pass the hit location, in local coordinates, to the widget that was hit, then react to it. This is the workaround I’ve used so far for VR and UMG.
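The coordinate conversion in that workaround can be sketched with plain math (these are illustrative types, not UE's `FVector`/`FTransform`): given the widget's plane in world space, map the line-trace hit into the widget's 2D pixel space.

```cpp
#include <cassert>
#include <cmath>

// Minimal sketch: map a world-space line-trace hit on a flat widget plane
// to the widget's 2D pixel coordinates. Assumes the widget's origin is its
// top-left corner and right/down are unit vectors in world space.
struct Vec3 { double x, y, z; };

static double Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

struct WidgetPlane {
    Vec3 origin;                     // world position of the top-left corner
    Vec3 right, down;                // unit vectors along width and height
    double worldWidth, worldHeight;  // widget extents in world units
    double pixelWidth, pixelHeight;  // widget draw size in pixels
};

// Returns true and writes (px, py) if the hit lies on the widget surface.
bool WorldHitToWidgetPixels(const WidgetPlane& w, const Vec3& hit,
                            double& px, double& py) {
    Vec3 d{hit.x - w.origin.x, hit.y - w.origin.y, hit.z - w.origin.z};
    double u = Dot(d, w.right) / w.worldWidth;   // 0..1 across the widget
    double v = Dot(d, w.down) / w.worldHeight;   // 0..1 down the widget
    if (u < 0.0 || u > 1.0 || v < 0.0 || v > 1.0) return false;
    px = u * w.pixelWidth;
    py = v * w.pixelHeight;
    return true;
}
```

Once you have (px, py), you can feed it to the widget's hit testing or position a cursor image there.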
I took a different route to 3D VR gaze input: I implemented basic gaze-to-actor-with-widget-component and widget-cursor-display in Blueprint, then made a GazeInput plugin using code adapted from SlateApplication’s input processing code. This way the gaze input works with all widgets as-is without the need to add new detection geometry or create new widget types.
I hope to clean it up and release the plugin with instructions in the future, unless Epic beats me to it and releases similar functionality themselves.
Using this, you can then hack around the issue and use UMG widgets by setting the cursor to the center of the viewport and tying an input action to a simulated click event:
You’ll also need to grab the HMD position and update the menu and crosshair positions on tick to keep them at the same position in space relative to your character. I’ve opted to simply place the menu inside an actor, offset it, and update the actor’s rotation:
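The per-tick update amounts to simple yaw math. A minimal sketch of the idea (plain math with angles in radians, not actual Blueprint nodes; the function names are made up for illustration):

```cpp
#include <cmath>

// Sketch: each tick, place the menu at a fixed forward offset from the pawn,
// rotated by the HMD yaw only (pitch and roll ignored so the menu stays
// upright and level in front of the player).
struct Vec2 { double x, y; };

// Menu position = pawn position + forward offset rotated by hmdYaw around Z.
Vec2 MenuPositionForYaw(Vec2 pawn, double forwardOffset, double hmdYaw) {
    return { pawn.x + std::cos(hmdYaw) * forwardOffset,
             pawn.y + std::sin(hmdYaw) * forwardOffset };
}

// The menu's own facing yaw: turned back toward the player.
double MenuFacingYaw(double hmdYaw) {
    const double pi = 3.14159265358979323846;
    return std::fmod(hmdYaw + pi, 2.0 * pi);
}
```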
After doing this, you can use all of your widgets as they are, without any individual hacks. Just make sure you uncheck “isFocusable” on any buttons, place an invisible button in the background of each base widget to catch any unhandled events and keep it in focus, and use the parabola distortion to make the widget curve around the character.
Hi Stormwind, your approach to this problem looks very clean. I was thinking about implementing something along these lines, but I don’t want to reinvent the wheel. Are you still planning to release this plugin and the Blueprints? If yes, do you have any ETA? Alternatively, would it be possible for you to release this stuff on GitHub as-is? I promise that I will contribute :)
Thanks
Strike what I said above about updating the menu and crosshair positions on tick; from yesterday’s Twitch broadcast I learned there is now default behavior for this.
Add a camera to your pawn, check “lock to hmd”, and uncheck “use pawn control rotation”. Parent your crosshair actor to the camera, and add your menu to the pawn as well.
If you want to offset the camera you’ll have to parent it again to another component.
The result is the crosshair stays locked in front of you, and the menu stays locked to the pawn unaffected by the hmd.
I managed to successfully implement good HMD-based VR interaction with UMG inside a 3D widget; that’s definitely the fastest way to go right now. I was wondering about that “Victory Simulate Key Press” function of yours (first screenshot, top right); it would be very useful, if only I could find it. How can I implement that one? It doesn’t appear in my Victory functions. For now I just made a basic function that sets the current button as focused based on its IsHovered state. Any hint on how to implement the Victory simulate key press?
Yeah, it shows up under my plugin section, and I was able to implement other functions such as Set Mouse Position, but I can’t find that exact node. I managed to avoid the issue by setting focus on elements that return true for IsHovered, but still, it would be much faster if I could simply bind a controller button to execute a simulated left mouse click. Anyway, I was able to set up a functional VR UMG interface with 3D widgets; if there’s interest I may do some tutorials about it, since I’ve seen many topics around regarding these issues.
@ - It looks like the Simulate Key Press stuff is in the GitHub code but not in the wiki download.
If you go to the GitHub repo, download it as a zip, unzip it, and then just replace the Source directory with the one from the zip file, you’ll get the nodes after rebuilding.
Edit: What worked for me (in conjunction with the simulated mouse click) was to set the mouse position to the screen-projected location of the hit on the UMG widget.
For everyone having issues with VR UMG widgets, my current workflow is the following:
install Rama’s Victory plugin -> set the mouse cursor to be bound to the center of the HMD viewport -> bind a controller button press to a simulated left mouse click event. This way every UMG button interaction should work out of the box. It’s a temporary solution for me, since I’m waiting to get a Vive for more complex interactions and am using a DK2 to develop at the moment.
In my case I’m using the HTC Vive controller as the pointer, rather than the headset. So I’m setting the cursor to the projected-to-screen world location of the raycast that goes off the controller. If you’re using the headset to control the mouse, you’d want to set it to the center as mentioned before.
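The projection itself is just pinhole math. A minimal sketch of the underlying idea (this is not UE's `ProjectWorldLocationToScreen`, just a symmetric perspective projection over a point already expressed in camera space):

```cpp
#include <cmath>

// Sketch: project a camera-space point onto the viewport so the OS cursor
// can be moved there. The camera looks along +Z, fovX is the horizontal
// field of view in radians, and the point must be in front of the camera.
struct Screen { double x, y; bool onScreen; };

Screen ProjectToScreen(double cx, double cy, double cz,
                       double fovX, double width, double height) {
    Screen s{0, 0, false};
    if (cz <= 0.0) return s;  // behind the camera: nothing to project
    double focal = 0.5 * width / std::tan(0.5 * fovX);
    s.x = width * 0.5 + (cx / cz) * focal;
    s.y = height * 0.5 - (cy / cz) * focal;  // screen Y grows downward
    s.onScreen = s.x >= 0 && s.x <= width && s.y >= 0 && s.y <= height;
    return s;
}
```

A point straight ahead of the camera lands at the exact center of the viewport, which is why the headset-as-pointer case reduces to just pinning the cursor there.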
Really trying to make this work with a Vive, but running into some weird issues… hopefully somebody can help.
I’m running this on the 3D widget actor:
And then this on the vive controller BP on tick:
“Laser target PURE” is the world position of a line trace from the controller, calculated earlier on.
It works when the mouse cursor is on top of the game’s window, but the second the mouse cursor moves away, the collision/mouse-over/etc. doesn’t work anymore. It looks like the controller’s position is re-normalized into the 2D space of both of my monitors instead of the window/viewport. It’s also offset, but that doesn’t seem to matter?
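One possible explanation, sketched below with illustrative names (not UE API): the cursor position may be expressed in absolute desktop coordinates while the hit location was computed relative to the viewport, so positions drift once the game window is not under the cursor. The conversion would need to subtract the viewport's desktop position and reject points outside it:

```cpp
// Sketch of the suspected coordinate-space mismatch. All names here are
// assumptions for illustration, not engine API.
struct Point { double x, y; };

// Desktop (absolute, spanning all monitors) coords -> viewport-local coords.
Point DesktopToViewport(Point desktop, Point viewportTopLeft) {
    return { desktop.x - viewportTopLeft.x, desktop.y - viewportTopLeft.y };
}

// True if a viewport-local point actually lies inside the viewport; hover
// and collision events only make sense for points that pass this check.
bool InsideViewport(Point local, double width, double height) {
    return local.x >= 0 && local.x < width &&
           local.y >= 0 && local.y < height;
}
```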
Hello guys,
I’m trying to find a way to show a widget on the desktop screen only and not in the HMD display. Is there any way to do that, or where would I need to edit the engine source?