
Thread: NexusVR - Connecting the Metaverse

  1. #1
    0

    NexusVR - Connecting the Metaverse

    Hey everyone!

    For this year's Leap Motion 3D Jam I've teamed up with Mac from the Night Cafe and we're making NexusVR, a multiplayer portaling system so you don't have to leave VR!

    Here is a first-pass WIP after hacking away at UMG to make it work with VR input:



    What you're seeing is me, in VR, hitting an arm menu, which pulls up all the VR experiences I have available: selecting one, tossing it away, selecting another and attaching it to the portal gateway in front, then using a push gesture to step through.

    What happens next? Wait for the next update

  2. #2
    0
    More WIP!

    This is what happens when you step through...



    A seamless and quick transition with no waiting. With an endless list to choose from, where will you go?

    GuitarVR experience by @zachkinstner

    How about sharing a video or link?


    Same concept, just turn your data into a virtual object and you can pass it to any receiver such as a big screen.

  3. #3
    0
    That's awesome!

  4. #4
    0
    This is exactly the kind of thing I've been waiting for

  5. #5
    0
    Just submitted the jam entry. Sadly, multiplayer got cut due to time constraints, but that means you get the single-player nexus all to yourself! Some of the sights of the nexus itself:

    Here's how portaling looks now:

    And you can browse both using the Leap Motion (which feels awesome, btw) and mouse + keyboard. The search bar is smart and will redirect non-URLs to Google search results.
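That "smart" redirect behavior can be sketched roughly like this; `ResolveSearchInput` and its heuristics are a hypothetical illustration, not NexusVR's actual code:

```cpp
#include <string>

// Hypothetical sketch of a smart address bar: treat input as a URL if it has
// a scheme or looks like a bare hostname; otherwise send it to a search engine.
std::string ResolveSearchInput(const std::string& Input)
{
    const bool bHasScheme = Input.rfind("http://", 0) == 0 ||
                            Input.rfind("https://", 0) == 0;
    const bool bLooksLikeHost = Input.find('.') != std::string::npos &&
                                Input.find(' ') == std::string::npos;

    if (bHasScheme)
        return Input;                    // already a full URL
    if (bLooksLikeHost)
        return "http://" + Input;        // bare hostname, add a scheme

    // Fall back to a Google search, escaping spaces in the query.
    std::string Query;
    for (char c : Input)
        Query += (c == ' ') ? '+' : c;
    return "https://www.google.com/search?q=" + Query;
}
```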



    If you have a leap motion and an Oculus HMD, grab it at nexusvr.io which redirects to the jam itch page. Just remember to plop all your portals (vr experiences) into the NexusVR/Portals folder and they'll show up in your portal menu!

    Enjoy!

  6. #6
    0
    Infiltrator
    Join Date
    Mar 2014
    Posts
    11
    Very impressive stuff! Keep up the great work.

  7. #7
    0
    How did you hack the UMG widget to appropriately display VR content? I had a question here:
    https://answers.unrealengine.com/questions/334010/does-umg-widgets-natively-support-vr-mode.html

  8. #8
    0
    Quote Originally Posted by KidBroker View Post
    How did you hack the UMG widget to appropriately display VR content? I had a question here:
    https://answers.unrealengine.com/questions/334010/does-umg-widgets-natively-support-vr-mode.html
    I answered your question and I'll elaborate a bit more here:

    For VR you shouldn't use screen-based widgets; instead use 3D widgets placed and sized in the world. That way they aren't anchored to your face and you can lean in to see more detail. It also sidesteps the need to fix screen-space support for VR. See https://docs.unrealengine.com/latest/INT/Engine/UMG/HowTo/Create3DWidgets/index.html for details. This step requires no hacks.

    The UMG hacks I've referred to above are about converting your hand's collision with your 3D widget into the expected responses in UMG. The end result is a surface about 50cm away from you that you can just touch as if it were a physical touch-sensitive surface. This means you can do the things you usually expect of modern tablets, such as momentum scrolling and tapping to select. With VR, though, you can also pass your hand through the screen, something you can't do in the real world, and that allows depth-based interaction. In my case I use this to convert what you're looking at into a data cube link you can throw at other screens.
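The core of that conversion is projecting a world-space hand hit into the widget's own plane. Here is a minimal sketch using plain structs in place of UE4's vector types; the function and parameter names (`WorldHitToWidgetLocal`, the axis vectors) are assumptions for illustration:

```cpp
// Plain-struct stand-ins for UE4's FVector math.
struct Vec3 { float X, Y, Z; };

static float Dot(const Vec3& A, const Vec3& B) { return A.X*B.X + A.Y*B.Y + A.Z*B.Z; }
static Vec3  Sub(const Vec3& A, const Vec3& B) { return {A.X-B.X, A.Y-B.Y, A.Z-B.Z}; }

struct WidgetTouch { float X, Y, Depth; };

// Project the hit onto the widget plane: X along the widget's right axis,
// Y along its up axis, Depth along its facing normal (positive = pushed in).
WidgetTouch WorldHitToWidgetLocal(const Vec3& Hit,
                                  const Vec3& WidgetOrigin,
                                  const Vec3& WidgetRight,
                                  const Vec3& WidgetUp,
                                  const Vec3& WidgetNormal)
{
    const Vec3 Local = Sub(Hit, WidgetOrigin);
    return { Dot(Local, WidgetRight), Dot(Local, WidgetUp), -Dot(Local, WidgetNormal) };
}
```

The resulting (x, y, depth) triple is what gets fed into something like the touchedAt(x,y,depth) call described above.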

    There are a lot of interesting directions to take this. Given people's interest in the widget stuff, I'll look into maybe packaging up some blueprints and make a video about it?

  9. #9
    0

    Awesomeness!

    There are a lot of interesting directions to take this. Given people's interest in the widget stuff, I'll look into maybe packaging up some blueprints and make a video about it?
    Hell yeah Getnamo. No question about it.

    I learn so much from all the stuff you give to the community.
    Thank you,

    behram

  10. #10
    0
    Infiltrator
    Join Date
    Jul 2015
    Posts
    17
    Very impressive! VR devices have incredible potential!

  11. #11
    0
    :O

    Wow! Great stuff man! Looking forward to seeing a repository with this functionality up on GitHub sometime maybe?

  12. #12
    0
    Samaritan
    Join Date
    Jun 2014
    Posts
    109
    Awesome stuff, keep it up
    Anything the mind of man can conceive and believe, it can be achieved. Napoleon Hill

  13. #13
    0
    Supporter
    Join Date
    Oct 2015
    Posts
    4
    Hi, looks good.

    I've tried to run NexusVR, but it doesn't work as well as in your video.
    Hands are very often not recognised (hidden), or recognised poorly.
    How did you do it?

  14. #14
    0
    Quote Originally Posted by zumer View Post
    Hi, looks good.

    I've tried to run NexusVR, but it doesn't work as well as in your video.
    Hands are very often not recognised (hidden), or recognised poorly.
    How did you do it?
    Make sure you have gone through the basic troubleshooting for the Leap Motion (no bright infrared sources in your FOV, push back slightly to minimize clutter in the FOV).

    Other than that, I think I have a hand-count bug in the NexusVR branch of the Leap plugin which may cause it to not always detect your hand in the FOV. If you don't see your hand, just bring it out of view and back in and it should get detected. Let me know if that helps.

    The next focus is squashing bugs and getting multiplayer support in.

  15. #15
    0
    Supporter
    Join Date
    Oct 2015
    Posts
    4
    Thanks,
    will try to find a solution.

    It looks like the Leap device works fine at the start,
    but then it starts to lag; maybe it overheated or something like that.

    In any case, you've done great work.
    We had a lot of fun trying to throw portals.

  16. #16
    0
    Quote Originally Posted by zumer View Post
    Thanks,
    will try to find a solution.

    It looks like the Leap device works fine at the start,
    but then it starts to lag; maybe it overheated or something like that.

    In any case, you've done great work.
    We had a lot of fun trying to throw portals.
    Sometimes the main window will get defocused (e.g. the mouse clicked outside the window when used with the web browser), which cuts the framerate down to the idle rate of one frame every 0.5 seconds. Just refocus the window (alt-tab or click inside it) to get the smooth experience back.
    Last edited by getnamo; 11-25-2015 at 06:10 PM.

  17. #17
    0
    Just realized I posted my question in the wrong thread (what a good start to the morning >.<), so here it goes: Wow, I really like what you did there, sir :P I always wanted to do a VR HUD based on the Leap Motion and a 3D widget. I made one following this tutorial http://coherent-labs.com/blog/3d-hol...1-3ds-max-ue4/ but I've never really been able to interact with it. Could you shed some light on the way you used line traces and collision to interact with them? I'd be really thankful. Also, is this BLUI instead of Coherent UI?

  18. #18
    0
    Quote Originally Posted by jungerroemer View Post
    Just realized I posted my question in the wrong thread (what a good start to the morning >.<), so here it goes: Wow, I really like what you did there, sir :P I always wanted to do a VR HUD based on the Leap Motion and a 3D widget. I made one following this tutorial http://coherent-labs.com/blog/3d-hol...1-3ds-max-ue4/ but I've never really been able to interact with it. Could you shed some light on the way you used line traces and collision to interact with them? I'd be really thankful. Also, is this BLUI instead of Coherent UI?
    Yep, the second menu is using BLUI.

    Regarding how it's been done, here's a breakdown:

    First, use UE4's collision system to get a hit location between the widget (on begin overlap) and your moving object (e.g. a hand mesh). At that point I sample the hand's location and fetch the frontmost (relative to your view) finger; this is the finger that determines the hit location and depth. What you need to do then is continue sampling the location until you stop colliding. While you're colliding, take this hit location and translate it into x, y and depth (z in my case) in the widget's space. This way your collision placement is now in 'widget local' position. Pass this touch into the widget component (e.g. a touchedAt(x,y,depth) function).

    At this point I have re-usable blueprint user interface widgets that can do custom collision tests, e.g. take a point in 2d space and check if it overlaps a component. With that setup you can then take the position you fed into the collided widget and check against your UI components for collisions; if they overlap, trigger your actions. In the case of my UI, when I hover, I move a 2D cursor widget to the intersection position, and when I intersect with the surface I cause scrolling in proportion to the movement of the hand. Then, when there is sufficient depth in the collision, I call a separate action (e.g. grab the window/link information in the form of a cube).

    This may sound a bit complicated, but it's re-usable, so from the implementation side it's pretty easy to extend the functionality to other UMG widgets you compose. For example, all you do is pass through the touch for each touchable widget and they'll respond accordingly; I only have to test 2-3 widgets for the composite UI. For my jam entry I only really implemented buttons and scroll boxes, but it wouldn't take much to extend these to other types. I also used the same UMG widget touch concept to pass scrolling data into the BLUI browser surface, which allows you to scroll both the browser and the portal list as if they were touchable surfaces.
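The depth part of the scheme (hover, surface touch, push-through grab) can be sketched as a tiny classifier. The thresholds and the `TouchPhase` names below are assumptions for illustration, not values from the project:

```cpp
// One sampled (x, y, depth) point drives hover, a surface touch, or a deep
// "grab" action, as described in the breakdown above.
enum class TouchPhase { Hover, Touch, DeepPress };

TouchPhase ClassifyTouch(float Depth)
{
    constexpr float TouchDepth = 0.0f;  // finger reaches the widget surface
    constexpr float GrabDepth  = 5.0f;  // finger pushed ~5 cm through it

    if (Depth >= GrabDepth)  return TouchPhase::DeepPress; // e.g. grab a data cube
    if (Depth >= TouchDepth) return TouchPhase::Touch;     // e.g. momentum scrolling
    return TouchPhase::Hover;                              // move the 2D cursor only
}
```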


    In your case you would replace the collision point with a Line Trace Hit, but the rest would work roughly the same.


    To make it simpler, I want to release some reusable blueprints on this but I will have to revisit it in January when I have a bit more time.

  19. #19
    0
    Thanks for taking the time to describe it. Yes, it actually sounds kind of complicated, especially the transfer to 2D. Really looking forward to the publication of your blueprints in January; I need them for Doom-like door interfaces (should work okay with the Leap since it's kind of a giant button, not a keypad) and also for the interface I've shown you. I'm going to spend the time until your release binding it correctly to your Leap collision hands. Call on me if you need something like that interface; I think I can translate it to be used with BLUI since it's basically an HTML page anyway. Both plugins should handle it nearly the same way, maybe with some actors named differently. Dunno, will have a look into BLUI anyway.

  20. #20
    0
    This looks awesome Getnamo, makes me want a Leap!
    Storyteller - An immersive VR audiobook player

    Dungeon Survival - WIP First person dungeon crawler with a focus on survival and environmental gameplay ala roguelikes

  21. #21
    0
    Champion
    Join Date
    Apr 2014
    Posts
    788
    Nice getnamo.

    I checked the web widget for my project and discarded it because I can't use it with controllers/Leap Motion, so nice solution/hack. That problem led me to take another path for communication with web servers, avoiding the web widget's limitations but with advantages too. Implementing a mixed system would be nice for my project. I'm busy with my project TODO list, but after that I'll check the widgets again.
    Last edited by knack; 12-10-2015 at 12:23 PM.
    pd: excuse my english.

    lfw/paid modeling, painting, texturing.

  22. #22
    0
    Samaritan
    Join Date
    Apr 2015
    Posts
    138
    Kickass, any plans on opening this up to the community?

    -> I would love to add to it - build a metaverse using ue!

  23. #23
    0
    Supporter
    Join Date
    Jan 2015
    Posts
    6
    Try messing with particle effects; that would look great replacing that throwing square.

  24. #24
    0
    This is incredible, great work!

  25. #25
    0
    I'm really shocked, this work is incredible.
    Game Designer @ Masked Pharaohs ..
    I'm Providing Arabic Support For Any One ..

  26. #26
    0
    That looks amazing, exactly what I want in VR.

  27. #27
    0
    niceeeee work

  28. #28
    0
    Great work getnamo! and thanks for the tips and info! This is so **** awesome!

  29. #29
    0
    I've gotten almost as far as most of you on this, but I've been plagued by this bit:
    'take a point in 2d space and check if it overlaps a component. With that setup you can then take the position you fed into the collided widget and check against your UI components for collisions'

    Anyone know how to handle this?

    To help anyone else out, here's how I'm doing it
    (this is from my character blueprint, and at the moment runs per tick, but you could do it however you like):
    https://dl.dropboxusercontent.com/u/154440/towidget.jpg (there is a function on the widget that takes the position)

  30. #30
    0
    Awesome project! Will you publish its sources eventually?

  31. #31
    0
    Apologies for the hiatus during the holidays, we're back on track now!

    Latest WIP: added full-screen video support for YouTube, plus fullscreen API support when requested by the browser.



    This should make it easy to watch movies or show and share ideas with others near you.

    One of the other big features of the nexus is the convenience of exploring various VR experiences without leaving VR, and in that context I've made a plugin that binds 7zip + file utilities to UE4. It will be used to allow the nexus to inspect things you download from the VR web browsers and automatically extract and move them into your Portals folder if a VR experience is detected.

    More on that when the full system is working!

    Quote Originally Posted by Moss View Post
    Awesome project! Will you publish its sources eventually?
    Parts and pieces; in general, any plugin I build or use for it will get its updates released.

    In that vein, I've posted the changes made to BLUI for NexusVR, including updating the CEF library to support fullscreen functionality and file download capability.

    I've also released my 7zip plugin for archiving and file manipulation to the community, more on that in the next update.


    There are also the VR UMG and web surfaces, which should have a public version available, but they're not quite ready yet.

    Quote Originally Posted by rrstuv View Post
    Kickass, any plans on opening this up to the community?

    -> I would love to add to it - build a metaverse using ue!
    You can make experiences usable in the metaverse by just making any VR experience (e.g. a UE4 VR program) and the nexus will be able to portal into and out of it. To support better transition, I recommend your experience starts in VR and fades from black at the beginning and fades back to black when the application exits.

    Tighter integration and features are not ready at this moment. If you have specific use cases in mind, let me know so we can think of ways to get those working.

    Quote Originally Posted by PHGtermi View Post
    I've gotten almost as far as most of you on this but ive been plagued by this bit
    'take a point in 2d space and check if it overlaps a component. With that setup you can then take the position you fed into the collided widget and check against your UI components for collisions'

    Anyone know how to handle this ?

    to help anyone else out heres how im doing it
    (this is from my character blueprint, and is at the moment going from per-tick but you could do it however you like)
    https://dl.dropboxusercontent.com/u/154440/towidget.jpg , there is a function on the widget that takes the position
    I would recommend breaking a lot of your graph into functions called on sub-classes with specific functionality. In my case I check collisions in the UI widget blueprints by having a function in the base UI widget class called IsPointWithinTouchableWidget. I can call this function for all the subwidgets (they inherit it from the base class) that should respond to touch input, and if the touch overlaps there, I forward input to that sub-widget via a TouchedAtLocation call. The sub-widget then handles further input, and so on until the touch is fully consumed.

    If you compose all your UI widgets this way, you can make a lot of the logic re-usable. This makes composing a VR UMG interface very reasonable: you just add e.g. touchable scrollboxes and touchable buttons, lay them out in the designer as usual, add them to your touch-check cycle, forward input if they get touched, and that's it.
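The composition pattern above can be sketched like this. The method names mirror the post (`IsPointWithinTouchableWidget`, `TouchedAtLocation`), but the rectangle-bounds base class and the `ForwardTouch` helper are assumptions standing in for the real Blueprint setup:

```cpp
#include <vector>

// Base touchable widget: sub-widgets inherit the overlap test and override
// TouchedAtLocation to consume input.
struct TouchableWidget
{
    float MinX, MinY, MaxX, MaxY;   // widget bounds in parent space
    bool  bTouched = false;

    TouchableWidget(float InMinX, float InMinY, float InMaxX, float InMaxY)
        : MinX(InMinX), MinY(InMinY), MaxX(InMaxX), MaxY(InMaxY) {}
    virtual ~TouchableWidget() = default;

    virtual bool IsPointWithinTouchableWidget(float X, float Y) const
    {
        return X >= MinX && X <= MaxX && Y >= MinY && Y <= MaxY;
    }

    virtual void TouchedAtLocation(float X, float Y, float Depth) { bTouched = true; }
};

// Forward a touch to the first overlapping sub-widget; returns true if consumed.
bool ForwardTouch(std::vector<TouchableWidget*>& SubWidgets, float X, float Y, float Depth)
{
    for (TouchableWidget* Widget : SubWidgets)
    {
        if (Widget->IsPointWithinTouchableWidget(X, Y))
        {
            Widget->TouchedAtLocation(X, Y, Depth);
            return true;
        }
    }
    return false;
}
```

A composite UI then just registers each touchable scrollbox or button in the sub-widget list and calls ForwardTouch once per sampled touch point.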
    Last edited by getnamo; 01-24-2016 at 02:38 PM.

  32. #32
    0
    thanks for that getnamo

    The plan was to make it all modular/subclassy once I got the base thing working, but thanks for the thoughts all the same. I'm moving house this weekend, so I won't be able to give it a shot till I get back.

    thanks again,
    Tim

  33. #33
    0
    Luminary
    Join Date
    Mar 2014
    Posts
    1,703
    Hey getnamo -

    Got a few Questions for you about NexusVR -

    1. You mentioned that Nexus VR is supposed to be some sort of portaling system that connects the metaverse. Will it be similar to JanusVR?
    2. Will you be giving people access to Nexus VR so that they can develop their own VR worlds using the API? If so when and how will I be able to get access?
    3. How do you plan on connecting the metaverse?
    4. Can we make non-VR experiences with Nexus VR?

  34. #34
    0
    Quote Originally Posted by HeadClot View Post
    Hey getnamo -

    Got a few Questions for you about NexusVR -
    ...
    Great questions; let me see if I can't clear up some of the concepts around Nexus VR. Right now there is a singleplayer version of the nexus available at https://getnamo.itch.io/nexusvr which was the Leap jam entry. It already allows you to portal from the nexus into any direct-to-HMD VR experience and back, just by dragging your VR game/program folder, or a folder with a shortcut (ending with _Custom.lnk), into the Portals folder before you launch the nexus.

    The main concept here is that there is no required API in order to be a supported experience; this way Nexus VR can support the whole metaverse and not just a sub-section of it. I generally just recommend you fade in from black straight into VR and fade out to black when you exit from your experience.
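That no-API discovery model amounts to a simple folder scan. The sketch below is illustrative, not NexusVR source: it assumes the post's "_Custom.ink" refers to a Windows `.lnk` shortcut, and a real version would read the directory with OS calls instead of taking a pre-built listing:

```cpp
#include <string>
#include <vector>

// One entry from a directory listing of the Portals folder.
struct PortalEntry { std::string Name; bool bIsDirectory; };

// True when a filename ends in "_Custom.lnk" (a custom shortcut portal).
bool IsCustomShortcut(const std::string& Name)
{
    const std::string Suffix = "_Custom.lnk";
    return Name.size() > Suffix.size() &&
           Name.compare(Name.size() - Suffix.size(), Suffix.size(), Suffix) == 0;
}

// Filter a folder listing down to the entries that should appear as portals:
// any directory (a dropped-in VR app) or any custom shortcut.
std::vector<std::string> DiscoverPortals(const std::vector<PortalEntry>& Listing)
{
    std::vector<std::string> Portals;
    for (const PortalEntry& Entry : Listing)
        if (Entry.bIsDirectory || IsCustomShortcut(Entry.Name))
            Portals.push_back(Entry.Name);
    return Portals;
}
```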

    You can for example go into VRChat, pop back out and check on some things using Virtual Desktop and then go to JanusVR staying completely in VR the whole time using natural hand UI, but only you will see the nexus right now because it's singleplayer.

    That's now. What Nexus VR hopes to grow into is a multiplayer experience where one person opens a portal and the nexus automatically syncs that to others stepping through that portal. If you don't have that experience it will pop up a window with a download page for the experience right inside VR. If you tap the download link, the nexus will automatically detect VR experiences in your downloads and shift them into your portal directory and you can then step through. Seamless without restrictions.

    What I want to support is the concept of 'hey let's play some vr volleyball', then one guy throws a portal and you and your friends step through, then the volleyball experience handles the multiplayer and when you're done you just pop out and get back to the nexus, ready to try anything else the metaverse might offer.

    Now with that context in mind, let's address your questions

    Quote Originally Posted by HeadClot View Post
    1. You mentioned that Nexus VR is supposed to be some sort of portaling system that connects the metaverse. Will it be similar to JanusVR?
    It's a different approach than JanusVR's. I hope to support some form of scene VR, but that is more of a long-term goal; in the short term, Nexus VR is about being able to portal into any VR program that is available. This means linking the other social spaces, e.g. moving from JanusVR to VRChat to AltSpace, or to multiplayer VR games.

    Quote Originally Posted by HeadClot View Post
    2. Will you be giving people access to Nexus VR so that they can develop their own VR worlds using the API? If so when and how will I be able to get access?
    You can already build VR worlds right now using Unity or UE4, and if you drag and drop that experience into your Portals folder it will be available to portal in and out of. Would it be great if there were a more cohesive API that would e.g. handle the complexity of VR UI, multiplayer, VOIP, input and avatar syncing for you? Yes; that, and some form of Unreal.js for mini-games, are mid-term goals. More on that later.

    If you have a specific idea or API you would like to have available, let me know. Feedback and ideas are how we can build the right tools for developers to use!

    Quote Originally Posted by HeadClot View Post
    3. How do you plan on connecting the metaverse?
    It's already connected for a singleplayer experience. The multiplayer version is in the works right now and its release will be the proper launch for Nexus VR.

    The discovery of new VR experiences is important to me, and I'm working on making the discovery of a single VR experience propagate to all the users in the nexus, allowing people to e.g. see what is currently popular, discover new things and share them with others easily.

    Quote Originally Posted by HeadClot View Post
    4. Can we make non-VR experiences with Nexus VR?
    Non-VR experiences won't work as expected at the moment, but there may be support for non-VR experience sharing in the nexus in the mid-term. I have some prototypes, but they're not ready to show just yet.

  35. #35
    1
    You can embed your own JavaScript text editor in your VR. No 3rd-party dependencies required; it's all built with UE4 + V8.

    [Attached screenshot: SnippetScreenshot.png]

  36. #36
    0
    Quote Originally Posted by nako_sung View Post
    You can embed your own JavaScript text editor in your VR. No 3rd-party dependencies required; it's all built with UE4 + V8.

    ...
    Unreal.js is so awesome! Definitely on my roadmap to integrate it into VR, have some cool ideas that have to wait a bit, but stay tuned

  37. #37
    0
    Hi getnamo. Can you give advice on how to achieve it? I mean the scroll-down and pointable events.

  38. #38
    0
    Quote Originally Posted by getnamo View Post
    I would recommend breaking a lot of your graph into functions called on sub-classes with specific functionality. In my case I check collisions in the UI widget blueprints by having a function in the base UI widget class called IsPointWithinTouchableWidget. I can call this function for all the subwidgets (they inherit it from the base class) that should respond to touch input, and if the touch overlaps there, I forward input to that sub-widget via a TouchedAtLocation call. The sub-widget then handles further input, and so on until the touch is fully consumed.

    If you compose all your UI widgets this way, you can make a lot of the logic re-usable. This makes composing a VR UMG interface very reasonable: you just add e.g. touchable scrollboxes and touchable buttons, lay them out in the designer as usual, add them to your touch-check cycle, forward input if they get touched, and that's it.
    I took a different route to 3D VR gaze input: I implemented basic gaze-to-actor-with-widget-component and widget-cursor-display in Blueprint, then made a GazeInput plugin using code adapted from SlateApplication's input processing code. This way the gaze input works with all widgets as-is without the need to add new detection geometry or create new widget types.

    I hope to clean it up and release the plugin and instructions in the future, unless Epic beats me to it and releases similar functionality themselves.

  39. #39
    0
    Infiltrator
    Join Date
    Mar 2016
    Posts
    11
    Hey getnamo! Your work looks great!

    I am also looking to build UI interaction using my hand. I read how you did it, but I am fairly new to Unreal Blueprint stuff.
    Do you have any source code, or even a piece of source code, to show us?

    Thanks

  40. #40
    0
    Amazing stuff
    My small game on IndieDB ****** Beams on Twitter ****** Beams on Steam ****** VideoStuff ****** PictureStuff
    UE brings Math back into my life or i am not sure.

