NexusVR - Connecting the Metaverse

Nice getnamo.

I checked the web widget for my project and discarded it because I can't use it with controllers/Leap Motion — nice solution/hack. That problem led me to take another path for communicating with web servers, avoiding the web widget, which has limitations but advantages too. Implementing a mixed system would be nice for my project. I'm busy with my project's TODO list, but after that I'll check the widget thing again.

Kickass, any plans on opening this up to the community?

-> I would love to add to it - build a metaverse using UE!

Try messing with particle effects, that would look great replacing that throwing square

This is incredible, great work!

I’m really shocked, this work is incredible.

That looks amazing, exactly what I want in VR.

niceeeee work

Great work getnamo! and thanks for the tips and info! This is so **** awesome!

I’ve gotten almost as far as most of you on this, but I’ve been plagued by this bit:
‘take a point in 2d space and check if it overlaps a component. With that setup you can then take the position you fed into the collided widget and check against your UI components for collisions’

Anyone know how to handle this?

To help anyone else out, here’s how I’m doing it
(this is from my character blueprint, and at the moment it runs per-tick, but you could do it however you like):
https://dl.dropboxusercontent.com/u/154440/towidget.jpg (there is a function on the widget that takes the position)

Awesome project! Will you publish its sources eventually?

Apologies for the hiatus during the holidays, we’re back on track now!

Latest WIP: added full screen video support for YouTube, and fullscreen API support if requested by the browser.


This should make it easy to watch movies or show and share ideas with others near you.

One of the other big features for the nexus is the convenience of exploring various VR experiences without leaving VR, and in that context I’ve made a plugin that binds 7zip + file utilities to UE4. It will be used to allow the nexus to inspect things you download from the VR web browsers and automatically extract and move them into your Portals folder if a VR experience is detected.

More on that when the full system is working!

Parts and pieces; in general, any plugin I build or use for it will get updates.

In that vein I’ve posted the changes made to BLUI for NexusVR, including updating the CEF library to support full screen functionality and file download capability.

I’ve also released my 7zip plugin for archiving and file manipulation to the community, more on that in the next update.

There are also the VR UMG and Web surfaces, which should have a public version available, but they’re not quite ready yet.

You can make experiences usable in the metaverse by just making any VR experience (e.g. a UE4 VR program) and the nexus will be able to portal into and out of it. To support better transition, I recommend your experience starts in VR and fades from black at the beginning and fades back to black when the application exits.

Tighter integration and features are not ready at this moment. If you have specific use cases in mind, let me know so we can think of ways to get those working.

I would recommend breaking a lot of your graph into functions called on sub-classes with specific functionality. In my case I check collisions in the UI widget blueprints by having a function in the base UI widget class called IsPointWithinTouchableWidget. I can call this function for all the sub-widgets that should respond to touch input (they inherit it from the base class), and if the touch overlaps one, I forward input to that sub-widget via a TouchedAtLocation call. The sub-widget then handles further input, and so on until the touch is fully consumed.

If you compose all your UI widgets this way you can make a lot of the logic re-usable. This makes composing a VR UMG interface very reasonable. You just add e.g. touchable scrollboxes, touchable buttons and lay them out in the designer as usual, add them to your touch check cycle and forward input if they get touched and that’s it.
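The composition pattern described above can be sketched in plain C++ (rather than Blueprint). The function names IsPointWithinTouchableWidget and TouchedAtLocation follow the post; the class names and layout logic here are illustrative assumptions, not the actual NexusVR code:

```cpp
#include <vector>

struct Vec2 { float X, Y; };

// Base UI widget class: every sub-widget inherits the hit test and can
// override the touch handler to consume input.
class TouchableWidget {
public:
    virtual ~TouchableWidget() = default;

    // Base-class hit test against this widget's rectangle.
    virtual bool IsPointWithinTouchableWidget(const Vec2& P) const {
        return P.X >= Pos.X && P.X <= Pos.X + Size.X &&
               P.Y >= Pos.Y && P.Y <= Pos.Y + Size.Y;
    }

    // Sub-widgets override this to handle the forwarded input.
    virtual void TouchedAtLocation(const Vec2&) {}

    Vec2 Pos{0, 0};
    Vec2 Size{0, 0};
};

// Example sub-widget: a touchable button that counts presses.
class TouchableButton : public TouchableWidget {
public:
    void TouchedAtLocation(const Vec2&) override { ++Presses; }
    int Presses = 0;
};

// The touch check cycle: forward the touch to the first sub-widget that
// reports an overlap, then stop (the touch is consumed).
void ForwardTouch(std::vector<TouchableWidget*>& Widgets, const Vec2& P) {
    for (auto* W : Widgets) {
        if (W->IsPointWithinTouchableWidget(P)) {
            W->TouchedAtLocation(P);
            return;
        }
    }
}
```

Because the hit test lives in the base class, adding a new touchable widget type is just a subclass with its own TouchedAtLocation, which is what makes the layout-in-designer workflow described above reasonable.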

thanks for that getnamo

The plan was to make it all modular/subclassy when I got the base thing working, but thanks for the thoughts all the same. I’m moving house this weekend so I won’t be able to give it a shot till I get back.

thanks again,
Tim

Hey getnamo -

Got a few Questions for you about NexusVR -

  1. You mentioned that Nexus VR is supposed to be some sort of portaling system that connects the metaverse. Will it be similar to JanusVR?
  2. Will you be giving people access to Nexus VR so that they can develop their own VR worlds using the API? If so, when and how will I be able to get access?
  3. How do you plan on connecting the metaverse?
  4. Can we make non-VR experiences with Nexus VR?

Great questions, let me see if I can’t clear up some of the concepts around Nexus VR. Right now there is a singleplayer version of the nexus available at NexusVR by getnamo, which was the Leap Jam entry. It already allows you to portal from the nexus into any direct-to-HMD VR experience and back by just dragging your VR game/program folder, or a folder with a shortcut (ending with _Custom.lnk), into the Portals folder before you launch the nexus.

The main concept here is that there is no required API in order to be a supported experience; this way Nexus VR can support the whole metaverse and not just a sub-section of it. I generally just recommend you fade in from black straight into VR and fade out to black when you exit from your experience.

You can for example go into VRChat, pop back out and check on some things using Virtual Desktop and then go to JanusVR staying completely in VR the whole time using natural hand UI, but only you will see the nexus right now because it’s singleplayer.

That’s now. What Nexus VR hopes to grow into is a multiplayer experience where one person opens a portal and the nexus automatically syncs that to others stepping through that portal. If you don’t have that experience it will pop up a window with a download page for the experience right inside VR. If you tap the download link, the nexus will automatically detect VR experiences in your downloads and shift them into your portal directory and you can then step through. Seamless without restrictions.

What I want to support is the concept of ‘hey let’s play some vr volleyball’, then one guy throws a portal and you and your friends step through, then the volleyball experience handles the multiplayer and when you’re done you just pop out and get back to the nexus, ready to try anything else the metaverse might offer.

Now with that context in mind, let’s address your questions:

It’s a different approach than JanusVR. I hope to support some form of scene VR, but that is more of a long-term goal; in the short term Nexus VR is about being able to portal into any VR program that is available. This means linking the other social spaces, e.g. moving from JanusVR to VRChat to AltSpace, or multiplayer VR games.

You can already build VR worlds right now using Unity or UE4, and if you drag and drop that experience into your Portals folder it will be available to portal in/out of. Would it be great if there was a more cohesive API that would e.g. handle the complexity of VR UI, multiplayer, VOIP, input and avatar syncing for you? Yes, and that, plus some form of unreal.js for mini-games, are mid-term goals. More on that later :slight_smile:

If you have a specific idea or API you would like to have available, let me know. Feedback and ideas are how we can build the right tools for developers to use!

It’s already connected for a singleplayer experience. The multiplayer version is in the works right now and its release will be the proper launch for Nexus VR.

The discovery of new VR experiences is important to me, and I’m working on making the discovery of a single VR experience propagate throughout all the users in the nexus, allowing people to e.g. see what is currently popular, discover new things, and share that with others easily.

Non-VR experiences won’t work as expected at the moment, but there may be support for non-VR experience sharing in the nexus in the mid-term. I have some prototypes, but they’re not ready to show just yet :wink:

You may embed your own JavaScript text editor in your VR. :slight_smile: No 3rd-party dependencies required. It is all built with UE4 + V8.

[Screenshot: SnippetScreenshot.jpg]

Unreal.js is so awesome! Definitely on my roadmap to integrate it into VR, have some cool ideas that have to wait a bit, but stay tuned :slight_smile:

Hi getnamo. Can you give advice on how to achieve it? I mean scroll-down and pointable events.

I took a different route to 3D VR gaze input: I implemented basic gaze-to-actor-with-widget-component and widget-cursor-display in Blueprint, then made a GazeInput plugin using code adapted from SlateApplication’s input processing code. This way the gaze input works with all widgets as-is without the need to add new detection geometry or create new widget types.
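The geometric core of gaze-to-widget input (before any Slate input processing) is a ray-plane intersection followed by a conversion to 2D widget-local coordinates. This sketch is not the poster's plugin, just an illustration of that math; all names and the plane representation are assumptions:

```cpp
#include <cmath>
#include <optional>

struct Vec3 {
    float X, Y, Z;
    Vec3 operator-(const Vec3& O) const { return {X - O.X, Y - O.Y, Z - O.Z}; }
    Vec3 operator+(const Vec3& O) const { return {X + O.X, Y + O.Y, Z + O.Z}; }
    Vec3 operator*(float S) const { return {X * S, Y * S, Z * S}; }
    float Dot(const Vec3& O) const { return X * O.X + Y * O.Y + Z * O.Z; }
};

struct Vec2 { float U, V; };

// Widget plane: Origin is its top-left corner; AxisU/AxisV are unit
// vectors along its width and height; Normal faces the viewer.
struct WidgetPlane { Vec3 Origin, AxisU, AxisV, Normal; };

// Intersect the gaze ray with the widget plane and return widget-local
// (U, V) in world units, or nothing if the gaze is parallel to the
// plane or the widget is behind the viewer.
std::optional<Vec2> GazeToWidget(const Vec3& EyePos, const Vec3& GazeDir,
                                 const WidgetPlane& W) {
    float Denom = GazeDir.Dot(W.Normal);
    if (std::fabs(Denom) < 1e-6f) return std::nullopt; // parallel
    float T = (W.Origin - EyePos).Dot(W.Normal) / Denom;
    if (T < 0.f) return std::nullopt; // behind the viewer
    Vec3 Hit = EyePos + GazeDir * T;
    Vec3 Local = Hit - W.Origin;
    return Vec2{Local.Dot(W.AxisU), Local.Dot(W.AxisV)};
}
```

The resulting (U, V) is what you would feed into a widget hit test or, as in the approach described above, hand to the engine's own input routing so existing widgets work unmodified.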

I hope to clean it up and release the plugin and instructions in future, unless Epic beats me and releases similar functionality themselves.

Hey getnamo! Your work looks great!

I am also looking to build UI interaction using my hand. I read how you did it, but I am kind of new to Unreal Blueprint stuff.
Do you have any source code, or even a piece of source code, to show us?

Thanks :slight_smile:

Amazing stuff :slight_smile: