[VIVE] UE VR Editor-like UI?

Is there any info or are there any tutorials on creating a GUI like the one shown in the UE VR Editor?

Or perhaps some other tutorials related to creating something like that? Tips, anything will be appreciated.

[Screenshot of the UE VR Editor UI]

Thanks,
GL

I’m curious about this as well, because I need to be able to use 3D UI widgets with a raycast coming out of the controller like that.

Here’s the closest thing I’ve found:

And in this case, it would involve modifying the VR Editor source code, which I am looking into. It’s new, and new means no instructions :stuck_out_tongue:

Something along these lines?

Creating 3D Widgets

Sure, that’s the easy part. I think the part most people here are wondering about is setting up the pointer so that it passes mouse events to the 3D widget. Right now you can set up awesome UIs, but you can’t really interact with them in VR.

We’ve been using BLUI for all our in-engine VR UIs up to this point, as it allows you to render the UI to geometry, then perform a UV lookup on that geometry. It’s not UMG, it’s HTML-based, but it does a good job. It’s also free, which puts it ahead of the likes of Coherent.
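If it helps, here’s roughly what the trace-plus-UV-lookup side looks like in C++. This is only a sketch: the pawn class, ControllerMesh, and the function name are placeholders, it assumes "Support UV From Hit Results" is enabled in Project Settings (and that FindCollisionUV is available in your engine version), and the resulting UV would then be passed into whatever mouse-input calls your UI renderer (BLUI, in our case) exposes.

```cpp
// Rough sketch: trace from the motion controller, then turn the hit on the
// UI mesh into a UV coordinate to use as a cursor position.
// ControllerMesh is a placeholder for whatever component aims the ray.
#include "Kismet/GameplayStatics.h"

bool AMyVRPawn::TraceUIForUV(FVector2D& OutUV) const
{
    const FVector Start = ControllerMesh->GetComponentLocation();
    const FVector End   = Start + ControllerMesh->GetForwardVector() * 1000.f;

    FCollisionQueryParams Params;
    Params.bTraceComplex    = true;  // complex trace needed for per-face UVs
    Params.bReturnFaceIndex = true;  // FindCollisionUV needs the face index

    FHitResult Hit;
    if (!GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
    {
        return false;
    }

    // UV channel 0 assumed; returns false if the mesh has no cached UV data.
    return UGameplayStatics::FindCollisionUV(Hit, /*UVChannel=*/0, OutUV);
}
```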

@Crow87 - seems like a good option. I just implemented a ton of UMG UI, though, doh.

If there were a way to hijack the mouse events and tell them to update based on a raycast’s position, it seems like it would work, but I can’t find anything like that on the Blueprint side.

I have been looking into getting mouse events to UMG through Blueprint, but so far I haven’t had the time to really dig into it. I’d rather use UMG instead of HTML (because I know exactly zero HTML), so I may have to pick this up again when I get a chance. I’ll post here if I manage to get anything working.

I would just pull over whatever they do in the VR Editor into 4.11, but I just got all my non-source-compatible plugins updated.

Small amount of progress - I now have it doing a raycast, then projecting the hit point and the 3D menu actor’s location onto a plane if it hit the 3D menu object. Then I subtract one from the other to get an offset from the upper-left corner, do some scaling magic (because my menu is scaled down inside the Blueprint), and set the image cursor to the correct position on the widget’s tick event.
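In rough C++ terms, the projection/offset part amounts to something like this. Just a sketch - my real version is Blueprint, and MenuMesh, CursorImage, UnitsToPixels, and MenuScale are placeholder names, with the menu pivot assumed to be at its upper-left corner.

```cpp
// Sketch of the offset math: project the hit point onto the menu plane,
// express it relative to the upper-left corner, then convert to widget
// pixels. Placeholder names throughout.
#include "Components/CanvasPanelSlot.h"

void AMenuActor::UpdateCursorFromHit(const FHitResult& Hit)
{
    const FVector PlaneNormal = MenuMesh->GetForwardVector();
    const FVector PlaneOrigin = MenuMesh->GetComponentLocation(); // assumed pivot = upper-left corner

    // Project the hit point onto the menu's plane and subtract the corner
    // to get an offset in world space.
    const FVector OnPlane = FVector::PointPlaneProject(Hit.ImpactPoint, PlaneOrigin, PlaneNormal);
    const FVector Offset  = OnPlane - PlaneOrigin;

    // Express that offset along the menu's right/up axes; widget Y grows
    // downward, hence the negation.
    const float LocalX =  FVector::DotProduct(Offset, MenuMesh->GetRightVector());
    const float LocalY = -FVector::DotProduct(Offset, MenuMesh->GetUpVector());

    // The "scaling magic": world units -> widget pixels, undoing the scale
    // applied to the menu inside the Blueprint.
    const FVector2D WidgetPos(LocalX * UnitsToPixels / MenuScale,
                              LocalY * UnitsToPixels / MenuScale);

    // Move the image acting as a cursor (assumes it sits on a Canvas Panel).
    if (UCanvasPanelSlot* CursorSlot = Cast<UCanvasPanelSlot>(CursorImage->Slot))
    {
        CursorSlot->SetPosition(WidgetPos);
    }
}
```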

Then I’m doing some hacky checking of overlaps. Sadly, this is where my workaround has hit a brick wall, because you can’t actually get the layout positions of UMG child widgets, so testing using position and size is not straightforward :confused:

So I have been toying around with the engine code to see if I could get a native VR UI working, to no avail. The way UMG/Slate is built up is not extensible at all; I even asked a question about it here. I got a pretty good answer from a fellow there, but it doesn’t satisfy me to basically wait for Epic to fix a bug they haven’t heard of.

So more interesting progress, though no final results, sadly:

Trying to get away from having to hit-test all the widgets myself, I resorted to setting the mouse position in C++ (and BP with a node). I confirmed my offset system works for the 3D widget, but sadly hover and clicking do not work if the mouse position is being forced :confused:
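For reference, the C++ side and the Blueprint node boil down to something like this. SetMouseLocation is the real APlayerController call; the function-library wrapper and its names are just how I exposed it and are placeholders.

```cpp
// Sketch of a Blueprint-callable wrapper around APlayerController's
// SetMouseLocation. It moves the cursor, but Slate doesn't synthesize the
// hover/click events it would for real mouse movement, which is the wall
// I'm hitting.
#include "Kismet/BlueprintFunctionLibrary.h"
#include "GameFramework/PlayerController.h"
#include "VRUILibrary.generated.h" // placeholder generated header

UCLASS()
class UVRUILibrary : public UBlueprintFunctionLibrary
{
    GENERATED_BODY()

public:
    UFUNCTION(BlueprintCallable, Category = "VR UI")
    static void ForceMousePosition(APlayerController* PC, FVector2D ScreenPos)
    {
        if (PC)
        {
            PC->SetMouseLocation(FMath::RoundToInt(ScreenPos.X),
                                 FMath::RoundToInt(ScreenPos.Y));
        }
    }
};
```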

Edit: It seems to actually be an issue with which input mode is set. I want to be in game-only input so I can move the camera to move the raycast, but to get mouse-over and click events to work it needs to be in Game and UI. If I can figure out how to force it not to handle input when it’s set to “Game and UI” mode, so the character gets input and I can mouselook as normal (and thus move the raycast), it may work.
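For anyone following along, the two modes I’m juggling look roughly like this. A sketch only - the lock-to-viewport setter is named differently across engine versions, and PlayerController is assumed to be your local player controller.

```cpp
// Game and UI: hover/click reach the widget, but the mouse is captured by
// Slate and mouselook stops driving the raycast.
FInputModeGameAndUI GameAndUI;
GameAndUI.SetHideCursorDuringCapture(false);
GameAndUI.SetLockMouseToViewport(true); // SetLockMouseToViewportBehavior() in newer engine versions
PlayerController->SetInputMode(GameAndUI);
PlayerController->bShowMouseCursor = false; // the 3D widget draws its own cursor image

// Game only: the character gets input and mouselook works, but UMG never
// sees hover or click events.
PlayerController->SetInputMode(FInputModeGameOnly());
```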