Also, as part of this development I built a raytrace-based input system (the raytrace is needed anyway to select which Coherent UI view has input focus). Here is the raytrace-based input in action, using the HMD directly as a mouse pointer (a regular mouse or controller button handles the click actions):
Here is an example of interacting with an object that is tens of meters away in virtual space, well out of reach of hand-based input:
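For anyone curious how the HMD-as-mouse part can work, here is a minimal sketch: cast a ray along the HMD's forward vector, intersect it with the UI quad's plane, and map the hit point to pixel coordinates for the browser view. All the types and names here are illustrative assumptions, not the Coherent UI or UE API:

```cpp
#include <cassert>
#include <cmath>
#include <optional>
#include <utility>

struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Hypothetical description of a flat in-world UI panel.
struct UiQuad {
    Vec3 origin;          // top-left corner in world space
    Vec3 right;           // unit vector along the quad's width
    Vec3 up;              // unit vector down the quad's height (v grows downward)
    Vec3 normal;          // facing the viewer
    float width, height;  // world-space size
    int pixelsX, pixelsY; // resolution of the UI texture
};

// Intersect the gaze ray with the quad's plane and return the pixel it hits,
// or nothing if the ray misses the quad.
std::optional<std::pair<int, int>>
RaycastUi(const UiQuad& q, Vec3 rayOrigin, Vec3 rayDir)
{
    float denom = Dot(rayDir, q.normal);
    if (std::fabs(denom) < 1e-6f) return std::nullopt; // ray parallel to quad
    float t = Dot(Sub(q.origin, rayOrigin), q.normal) / denom;
    if (t < 0.0f) return std::nullopt;                 // quad is behind the viewer
    Vec3 hit = {rayOrigin.x + rayDir.x * t,
                rayOrigin.y + rayDir.y * t,
                rayOrigin.z + rayDir.z * t};
    Vec3 local = Sub(hit, q.origin);
    float u = Dot(local, q.right) / q.width;           // 0..1 across
    float v = Dot(local, q.up) / q.height;             // 0..1 down
    if (u < 0.0f || u > 1.0f || v < 0.0f || v > 1.0f) return std::nullopt;
    return std::make_pair(int(u * q.pixelsX), int(v * q.pixelsY));
}
```

From there, the resulting pixel coordinate is what you would forward to the UI view as a mouse-move event, with real mouse or controller buttons supplying the clicks.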
Well, the concept would be the same: for any 2D surface, you can transform the Leap finger position into that surface's local space and then detect when the pointer finger crosses the Z plane.
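The idea above can be sketched in a few lines: express the tracked fingertip in the panel's local frame, and fire a "touch" when its local Z goes from positive (in front of the panel) to non-positive (through it). The types and names are illustrative assumptions, not the Leap Motion or Coherent API:

```cpp
#include <cassert>

struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Orthonormal basis of the 2D panel in world space (hypothetical struct).
struct Surface {
    Vec3 origin, right, up, normal;
};

// Express a world-space point in the panel's local frame:
// x/y are the 2D coordinates on the surface, z is the distance in front of it.
Vec3 ToSurfaceSpace(const Surface& s, Vec3 worldPos)
{
    Vec3 rel = Sub(worldPos, s.origin);
    return { Dot(rel, s.right), Dot(rel, s.up), Dot(rel, s.normal) };
}

// Track the fingertip's local z between frames; a positive-to-non-positive
// transition means the finger just pushed through the panel: a touch.
struct TouchDetector {
    float lastZ = 1.0f;
    bool Update(const Surface& s, Vec3 fingerWorld, float& outX, float& outY)
    {
        Vec3 local = ToSurfaceSpace(s, fingerWorld);
        bool touched = (lastZ > 0.0f && local.z <= 0.0f);
        lastZ = local.z;
        outX = local.x;
        outY = local.y;
        return touched;
    }
};
```

The surface-space x/y at the moment of the touch is what you would convert to pixel coordinates and send to the UI as a click.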
So it’s not specific to Coherent, although the Coherent API is very convenient here: you can pass pixel-coordinate mouse events directly to the Coherent browser window, so it was pretty easy to implement. I don’t know anything about UMG widgets, but if they have a similar API then you should be able to implement a similar interaction.
Pixelvspixel, you could use this documentation; it’s an interactive 3D widget tutorial. I’m planning the same type of thing, but what I’m planning is the register/login and character selection for my VR MMORPG, so it will be a room with an interactive 3D panel in front of the player.