For this year’s Leap Motion 3D Jam I’ve teamed up with Mac from the Night Cafe and we’re making NexusVR, a multiplayer portaling system so you don’t have to leave VR!
Here's a first-pass WIP after hacking away at UMG to make it work with VR input:
What you're seeing is me, in VR, hitting an arm menu, which pulls up all the VR experiences I have available; selecting one, tossing it away; then selecting another one, attaching it to the portal gateway in front, and using a push gesture to step through.
Just submitted the jam entry. Sadly, multiplayer got cut due to time constraints, but that means you get the single-player nexus all to yourself! Some of the sights of the nexus itself:
How portaling looks now:
You can browse both using the Leap Motion (which feels awesome, btw) and mouse+keyboard. The search bar is a smart search bar: anything that isn't a URL gets redirected to Google search results.
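The URL-vs-search decision can be as simple as the heuristic below. This is a sketch of the idea in UE4 C++, my own guess at the logic rather than the exact check NexusVR ships with:

```cpp
// Sketch: anything that doesn't look like a URL becomes a Google search query.
// The heuristic (scheme prefix, or a dot with no spaces) is illustrative.
FString ResolveSearchInput(const FString& Input)
{
	const bool bLooksLikeUrl =
		Input.StartsWith(TEXT("http://")) ||
		Input.StartsWith(TEXT("https://")) ||
		(Input.Contains(TEXT(".")) && !Input.Contains(TEXT(" ")));

	if (bLooksLikeUrl)
	{
		// Prepend a scheme for bare domains like "nexusvr.io".
		return Input.StartsWith(TEXT("http")) ? Input : FString(TEXT("http://")) + Input;
	}

	// Not a URL: hand it off to Google as a search query.
	return FString(TEXT("https://www.google.com/search?q=")) + Input.Replace(TEXT(" "), TEXT("+"));
}
```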
If you have a Leap Motion and an Oculus HMD, grab it at nexusvr.io, which redirects to the jam itch page. Just remember to plop all your portals (VR experiences) into the NexusVR/Portals folder and they'll show up in your portal menu!
The UMG hacks I referred to above are about converting your hand's collision with a 3D widget into the responses UMG expects. The end result is a surface about 50 cm away from you that you can touch as if it were a physical touch-sensitive surface. This means you can do the things you usually expect of modern tablets, such as momentum scrolling (sketched below) and tapping to select. In VR, though, you can also pass your hand through the screen, which you can't do in the real world, and that allows depth-based interaction. In my case I use this to convert what you're looking at into a data-cube link you can throw at other screens.
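For the momentum-scrolling feel, the gist is: while a finger is in contact the content follows it 1:1, and on release the last velocity coasts and decays. A minimal sketch under those assumptions (the struct and constants are mine, not the shipped blueprints):

```cpp
// While touching, scroll follows the finger; after release, it coasts.
struct FMomentumScroller
{
	float Offset = 0.f;   // current scroll offset, e.g. fed into UScrollBox::SetScrollOffset
	float Velocity = 0.f; // last measured finger velocity

	// Call each frame while the finger is in contact with the surface.
	void Drag(float FingerDeltaY, float DeltaTime)
	{
		Offset += FingerDeltaY;
		Velocity = (DeltaTime > 0.f) ? FingerDeltaY / DeltaTime : 0.f;
	}

	// Call each frame after the finger leaves the surface.
	void Coast(float DeltaTime)
	{
		Offset += Velocity * DeltaTime;
		Velocity *= FMath::Clamp(1.f - 4.f * DeltaTime, 0.f, 1.f); // rough exponential decay
	}
};
```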
There are a lot of interesting directions to take this. Given people's interest in the widget stuff, I'll look into packaging up some blueprints and maybe making a video about it.
Other than that, I think I have a hand-count bug in the nexusvr branch of the Leap plugin, which may cause it to not always detect your hand in the FOV. If you don't see your hand, just bring it out of view and back in to get it detected. Let me know if that helps.
The next focus is squashing bugs and getting multiplayer support in.
Sometimes the main window will get defocused (e.g. when the mouse is clicked outside the window while using the web browser), which cuts the framerate down to the idle rate (a frame every 0.5 seconds). Just refocus the window (alt-tab or click inside it) to get the smooth experience back.
Just realized I posted my question in the wrong thread (what a good start to the morning >.<), so here it goes: Wow, I really like what you did there! I've always wanted to make a VR HUD based on the Leap Motion and a 3D widget; I made mine following this tutorial: http://coherent-labs.com/blog/3d-hol...1-3ds-max-ue4/. However, I've never really been able to interact with it. Could you shed some light on the way you used line traces and collision to interact with them? I'd be really thankful. Also, is this BLUI instead of Coherent UI?
First you use UE4's collision system to get a hit location between the widget (on begin overlap) and your moving object (e.g. the hand mesh). At that point I sample the hand's location and fetch the frontmost finger (relative to your view); this is the finger that determines the hit location and depth. You then keep sampling the location until you stop colliding. While you're colliding, take the hit location and translate it into x, y, and depth (z in my case) in the widget's space; your collision placement is now in 'widget local' position. Pass this touch into the widget component (e.g. a touchedAt(x,y,depth) function). A minimal sketch of the conversion is below.
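This sketch assumes the widget quad spans local X/Y with depth along local Z; the sizes, axis conventions, and the touchedAt call are placeholders for whatever your own setup exposes:

```cpp
#include "Components/WidgetComponent.h"

// Converts a world-space fingertip position into (x, y) in widget pixel space
// plus press depth. Sizes and axis conventions are illustrative.
FVector WidgetTouchFromWorld(const UWidgetComponent* Widget, const FVector& FingertipWorld)
{
	// Bring the fingertip into the widget component's local space.
	const FVector Local = Widget->GetComponentTransform().InverseTransformPosition(FingertipWorld);

	const FVector2D QuadHalfSize(50.f, 30.f);  // half-extents of the quad in world units
	const FVector2D PixelSize(1024.f, 600.f);  // the widget's draw size in pixels

	// Local X/Y map onto the surface; local Z is how far the finger pushed in
	// ("depth (z in my case)" above). The sign depends on which way your quad faces.
	const float X = (Local.X / (2.f * QuadHalfSize.X) + 0.5f) * PixelSize.X;
	const float Y = (0.5f - Local.Y / (2.f * QuadHalfSize.Y)) * PixelSize.Y;
	const float Depth = -Local.Z;

	return FVector(X, Y, Depth); // then: touchedAt(Result.X, Result.Y, Result.Z)
}
```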
At this point I have reusable blueprint user-interface widgets that can do custom collision tests, e.g. take a point in 2D space and check whether it overlaps a component. With that setup, you can take the position you fed into the collided widget and check it against your UI components for overlaps; if they overlap, perform your actions. In the case of my UI: when I hover, I move a 2D cursor widget to the intersection position; when I intersect the surface, I scroll in proportion to the movement of the hand; and when there is sufficient depth to the collision, I call a separate action (e.g. grab the window/link information in the form of a cube).
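In code terms, the routing boils down to a 2D containment test plus a depth threshold. A sketch with made-up names and thresholds:

```cpp
#include "Templates/Function.h"

// A touchable region in widget pixel space; OnTap and OnDeepPress stand in
// for whatever the widget actually does.
struct FTouchableRegion
{
	FVector2D Min; // top-left corner
	FVector2D Max; // bottom-right corner
	TFunction<void()> OnTap;
	TFunction<void()> OnDeepPress; // e.g. grab the window/link cube
};

void RouteTouch(const TArray<FTouchableRegion>& Regions, const FVector2D& Touch, float Depth)
{
	const float SurfaceDepth = 0.f; // finger at the surface
	const float GrabDepth = 8.f;    // world units past the surface

	for (const FTouchableRegion& Region : Regions)
	{
		const bool bInside =
			Touch.X >= Region.Min.X && Touch.X <= Region.Max.X &&
			Touch.Y >= Region.Min.Y && Touch.Y <= Region.Max.Y;
		if (!bInside)
		{
			continue;
		}

		if (Depth > GrabDepth)
		{
			Region.OnDeepPress(); // sufficient depth: the separate action
		}
		else if (Depth > SurfaceDepth)
		{
			Region.OnTap(); // surface contact: tap (or drive scrolling)
		}
		// Depth <= 0: hovering in front; just move the 2D cursor to Touch.
	}
}
```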
This may sound a bit complicated, but it's reusable, so on the end-implementation side it's pretty easy to extend the functionality to other UMG widgets you compose. All you do is pass the touch through to each touchable widget and they respond accordingly; I only had to test 2-3 widgets for the composite UI. For my jam entry I only really implemented buttons and scroll boxes, but it wouldn't take much to extend this to other widget types. I also used the same UMG widget-touch concept to pass scrolling data into the BLUI browser surface, which lets you scroll as if it were a touchable surface in both the browser and the portal list.
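One way to structure that pass-through is a small interface every touchable widget implements, so the composite just forwards the touch until someone consumes it. This is my own formulation of the idea, not the jam blueprints:

```cpp
// Each touchable widget implements this and reacts to a forwarded touch.
class ITouchable
{
public:
	virtual ~ITouchable() {}
	// Returns true if the widget consumed the touch.
	virtual bool TouchedAt(const FVector2D& Pos, float Depth) = 0;
};

// The composite UI just passes the touch through to its children.
bool ForwardTouch(const TArray<ITouchable*>& Children, const FVector2D& Pos, float Depth)
{
	for (ITouchable* Child : Children)
	{
		if (Child->TouchedAt(Pos, Depth))
		{
			return true;
		}
	}
	return false;
}
```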
In your case you would replace the collision point with a line-trace hit, but the rest works roughly the same.
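Something like this fragment, assuming it runs inside an actor with a Camera scene component; the channel and reach are illustrative:

```cpp
FHitResult Hit;
FCollisionQueryParams Params(FName(TEXT("UITrace")), /*bTraceComplex=*/ false, this);

const FVector Start = Camera->GetComponentLocation();
const FVector End = Start + Camera->GetForwardVector() * 500.f; // ~5 m reach

if (GetWorld()->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
{
	// Hit.ImpactPoint takes the place of the overlap hit location; feed it
	// into the same world-to-widget-local conversion as before.
}
```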
To make this simpler, I want to release some reusable blueprints for it, but I'll have to revisit it in January when I have a bit more time.
Thanks for taking the time to describe it. Yes, it actually sounds kind of complicated, especially the transfer to 2D. I'm really looking forward to the publication of your blueprints in January; I need them for Doom-like door interfaces (which should work fine with the Leap, since it's basically a giant button rather than a keypad) and also for the interface I showed you. I'll spend the time until your release binding it correctly to your Leap collision hands. Call on me if you need something like that interface; I think I can translate it for use with BLUI, since it's basically an HTML page anyway. Both plugins should handle it nearly the same way, maybe with some differently named actors. I don't know; I'll have a look into BLUI anyway.