In-game "Touch Screen" interfaces in VR with Leap Motion and Coherent UI

Hi All,

I implemented a proof of concept for integrating both Coherent UI and Leap Motion to create a virtual “touch screen” type interface in-game.

You can see it in action here:
https://youtube.com/watch?v=3xvFLZI7FYA
You can also check out the source code on github:

https://github.com/lmalave/coherent-ui-virtual-touch

Would welcome any feedback!

Also, as part of this development I implemented raytrace-based input (the raytrace is needed anyway to select the Coherent UI view that has input focus). Here is the raytrace-based input using the HMD directly as a mouse pointer (regular mouse or controller buttons handle the click actions):

Here is an example of interacting with an object that is tens of meters away in virtual space, well out of reach of hand-based input:

https://www.youtube.com/watch?v=j2rq2uUfXyE
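The HMD-pointer idea above amounts to casting a ray from the head position along the view direction, intersecting it with the UI panel's plane, and converting the hit point to pixel coordinates. A minimal sketch of that math in plain C++ (engine types stubbed out; all names are illustrative, not taken from the linked repo):

```cpp
#include <cmath>

// Minimal 3D vector; in Unreal this would be FVector.
struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b)          { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  mad(Vec3 a, Vec3 d, float t) { return {a.x + d.x * t, a.y + d.y * t, a.z + d.z * t}; }
static float dot(Vec3 a, Vec3 b)          { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Intersect the HMD gaze ray with the plane of a UI panel and convert the
// hit point to pixel coordinates on the panel's view. Returns false when
// the panel is behind the viewer, parallel to the gaze, or missed.
bool GazeToPixel(Vec3 head, Vec3 gazeDir,                      // HMD pose
                 Vec3 panelOrigin,                             // panel top-left corner
                 Vec3 right, Vec3 down, Vec3 normal,           // unit panel axes
                 float width, float height,                    // extent in world units
                 int pxX, int pxY,                             // view resolution
                 int* outX, int* outY) {
    float denom = dot(gazeDir, normal);
    if (std::fabs(denom) < 1e-6f) return false;                // gaze parallel to panel
    float t = dot(sub(panelOrigin, head), normal) / denom;
    if (t < 0.0f) return false;                                // panel behind the viewer
    Vec3 hit = sub(mad(head, gazeDir, t), panelOrigin);        // hit point, panel-relative
    float u = dot(hit, right), v = dot(hit, down);             // local 2D coordinates
    if (u < 0.0f || u > width || v < 0.0f || v > height) return false;
    *outX = (int)(u / width  * pxX);
    *outY = (int)(v / height * pxY);
    return true;
}
```

In-engine you would feed this the camera location and forward vector each frame, then pass the resulting pixel coordinates on as mouse-move events.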

That’s amazing!

[QUOTE=lmalave;238186]
I implemented a proof of concept for integrating both Coherent UI and Leap Motion to create a virtual “touch screen” type interface in-game.
[/QUOTE]

And I’m sitting here trying to figure out how to set up an in-game ATM… Props to you!

That is amazing! How hard would it be to get this working with a 3D UMG widget? Is the interaction generally the same, or is it exclusive to Coherent?

Oh that is sweet! I wish there were tactile feedback gloves to work along with that!

Hi pixelvspixel,

Well, the concept would be the same: for any 2D surface you can transform the Leap finger position into the local space of that surface and then determine when the pointer finger crosses the Z plane.
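The approach described above can be sketched in a few lines of plain C++ (engine types stubbed out; the struct and function names here are illustrative, not taken from the linked repo):

```cpp
// Minimal 3D vector; in Unreal this would be FVector.
struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Describes the virtual screen: an origin (top-left corner), unit vectors
// along its edges and normal, its physical extent, and the UI resolution.
struct TouchSurface {
    Vec3  origin;           // top-left corner in world space
    Vec3  right, down;      // unit vectors along the surface edges
    Vec3  normal;           // unit normal pointing toward the user
    float width, height;    // extent in world units
    int   pixelsX, pixelsY; // resolution of the UI view
};

struct TouchResult {
    bool touching; // finger tip has crossed the surface's Z plane
    int  px, py;   // pixel coordinates on the UI view
};

// Transform a Leap finger-tip position into the surface's local space,
// check whether it has crossed the plane, and map it to a pixel.
TouchResult EvaluateTouch(const TouchSurface& s, Vec3 fingerTip) {
    Vec3  rel = sub(fingerTip, s.origin);
    float u = dot(rel, s.right);   // local X, in world units
    float v = dot(rel, s.down);    // local Y, in world units
    float d = dot(rel, s.normal);  // signed distance in front of the plane

    TouchResult r{};
    r.touching = (d <= 0.0f) &&
                 u >= 0.0f && u <= s.width &&
                 v >= 0.0f && v <= s.height;
    r.px = (int)(u / s.width  * s.pixelsX);
    r.py = (int)(v / s.height * s.pixelsY);
    return r;
}
```

Called once per frame with the tracked finger-tip position, the `touching` flag going from false to true is the "touch down" moment, and the pixel coordinates are what get forwarded to the UI.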

If you want to take a look, the code is posted on GitHub: https://github.com/lmalave/coherent-ui-virtual-touch

So it’s not specific to Coherent, although the Coherent API is very convenient in that you can pass pixel-coordinate mouse events directly to the browser view, which made it easy to implement. I don’t know the UMG widget API, but if it exposes something similar you should be able to implement the same interaction.
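The event-forwarding part described above is essentially a small state machine: crossing the plane generates a press, leaving it a release, and staying on it generates moves. A hedged sketch in plain C++, where `MouseSink` is a hypothetical callback standing in for whatever input API the UI system exposes (Coherent's browser view, a UMG widget, etc.):

```cpp
#include <functional>
#include <utility>

enum class MouseAction { Move, Press, Release };

// Hypothetical sink for pixel-space mouse events; in practice this would
// forward to the UI system's own input API.
using MouseSink = std::function<void(MouseAction, int x, int y)>;

// Turns per-frame touch samples into discrete mouse events.
class TouchToMouse {
public:
    explicit TouchToMouse(MouseSink sink) : sink_(std::move(sink)) {}

    // Call once per frame with the current touch state and pixel position.
    void Update(bool touching, int x, int y) {
        if (touching && !wasTouching_)      sink_(MouseAction::Press, x, y);
        else if (!touching && wasTouching_) sink_(MouseAction::Release, x, y);
        else if (touching)                  sink_(MouseAction::Move, x, y);
        wasTouching_ = touching;
    }

private:
    MouseSink sink_;
    bool wasTouching_ = false;
};
```

Keeping this logic separate from the raycast/transform code means the same touch detection can drive different UI backends by swapping the sink.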

Pixelvspixel, you could use this documentation; it’s an interactive 3D widget tutorial. I’m planning the same type of thing, but for the register/login and character selection in my VR MMORPG, so it will be a room with an interactive 3D panel in front of the player.

Hi lmalave,

Looks great! How do you handle keyboard input in the Coherent UI surfaces inside VR? I tried to enable input as described in their guide.

The issue I have is that when I enable input, the rendered view in the HMD is corrupted. Have you experienced anything like that?