Unreal Editor in VR - Research

Hi Everyone,

I am working on the VR Editor at Epic as an intern for my graduation project, and I need to write a thesis about it. So I need your help: feedback, concepts, ideas, inspiration, opinions and requests. I will keep an eye on the main VR Editor thread, but feel free to post anything you want about the VR Editor here. I will also be posting surveys and other material about the VR Editor in this thread.

Thanks in advance for your input!

Hi,

I want MultiEdit for the in-editor VR! Even if it's not multiple people editing at the same time, just being able to walk through a level with another person and show them stuff would be very nice.

Haha yeah that would be amazing :slight_smile:

Kind of like MakeVR? http://sixense.com/makevr
I’m not sure that ever got released (and aren’t people still waiting for their STEMs?), but the input/control paradigm seems pretty similar!

I posted this on the main thread too, but here are some of my ideas:

  1. Voice commands for whatever you are looking or pointing at would be a good addition to direct manipulation with the touch controllers. “Scale 150%”, “New light”, “Add material”, “Scale world 50%”, “Add tree mesh”… This would reduce the amount of pointing, calling up the browser, and the difficulty of entering precise input values.

  2. How about some extra tools such as a measuring tape, protractor, 3D guides and construction planes for measuring out real-world scale and layout? I love how SketchUp works: if you want a doorway 8 ft high and 3.5 ft wide, it's super easy to measure it out with the guides first and then build on top of it. Snapping to construction guides is awesome. Most non-CAD 3D software doesn't have good measuring or layout tools. I want to work more like a construction engineer in VR.

  3. I would like the ability to place the editor windows in user-definable places, for example: 1. on the controller (like it is now); 2. docked as a HUD at a desired position and size in the user's field of view; 3. docked at desired locations and sizes in the world, so users could create their own virtual workspace. All three modes seem really interesting; why use only one approach? Also, create a very intuitive way to grab the editor screens off the controller and stick them where you want, in the HUD or somewhere in the world.

  4. How about taking possession of a character in VR so you can use your own body to put it in a pose or teach it how to move or perform a complex action? It would of course require additional full-body motion capture, but I just like the idea of doing it in VR. When you jump inside a body you are animating in first person; leave the body and you can refine the animation from a third-person point of view. I can imagine onion skinning in VR, and also manipulating a 3D spline for path-based animation. There are so many interesting things to explore with VR animation.

  5. I think directing Matinee/Sequencer cutscenes in VR would be super cool. You first block out your shots, stage props and actors, set up the lighting, and perform your own camera work and acting. Then you do virtual editing and camera cuts. You could loop your sequence and keep adding layer on top of layer. You might manipulate time in similar ways to world scaling, with gestures.

  6. I really want the ability to walk around my scene and set up all the game logic in VR. For example, I could add a collision volume in a doorway, and when I walk into it, a floating event alert would appear. I would then connect this to a Blueprint handler that I add to the scene. When I touch the 3D Blueprint hovering in space, it would expand into a representation of a Blueprint graph with 3D plumbing and layout. There is, as far as I'm aware, no programming metaphor in VR space yet, but it seems like Blueprints should work. (I think Epic should sponsor a contest to design and illustrate the best and most exciting way to bring Blueprints into VR space.)

  7. Add some sort of filtering to the controller motion to smooth out shaky hands while the editor screens are visible (sketched below). The shaking makes the screens appear unstable and makes accurate interactions hard to articulate, although I'm not sure how the experience actually feels in VR.
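
A simple exponential low-pass filter would probably be enough for point 7. Here is a rough UE4-style sketch; `FControllerSmoother` and its members are made-up names, not actual VR Editor code:

```cpp
// Rough sketch: exponentially smooth a motion controller's transform before
// driving editor UI with it. All names here are illustrative.
#include "CoreMinimal.h"

struct FControllerSmoother
{
    FVector SmoothedLocation = FVector::ZeroVector;
    FQuat   SmoothedRotation = FQuat::Identity;
    float   SmoothingSpeed   = 12.0f; // higher = less lag, but less smoothing

    void Tick(float DeltaTime, const FVector& RawLocation, const FQuat& RawRotation)
    {
        // Frame-rate independent low-pass filter for the position...
        SmoothedLocation = FMath::VInterpTo(SmoothedLocation, RawLocation,
                                            DeltaTime, SmoothingSpeed);
        // ...and a matching slerp toward the raw rotation.
        const float Alpha = FMath::Clamp(DeltaTime * SmoothingSpeed, 0.0f, 1.0f);
        SmoothedRotation = FQuat::Slerp(SmoothedRotation, RawRotation, Alpha);
    }
};
```

The trade-off is latency: the stronger the smoothing, the more the panels lag behind the hand, so it might make sense to apply it only while an editor screen is visible.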

Thank you for your ideas :slight_smile:

  1. This would be awesome to do, especially with the limited number of inputs/buttons on the controllers. But I think it is something for the future; definitely something to keep in mind!
  2. This sounds like a really good idea!
  3. Working on this right now.
  4/5/6. We believe there could be a VR implementation for all kinds of things in UE (animations, sequences, materials, etc.). Right now it is more about what is most important and where the value is for developers.

Hey, thanks for looking at my ideas. I really understand what you're saying about creating value for developers. I'm glad you have that kind of principle in place to help guide your priorities. I'll try to imagine more value-based ideas going forward.

  1. One thing I would love: the ‘laser pointers’ often look like lightsabers from Star Wars as you swing them around. Could they be put into a destroy mode, where they become shorter like a sword, so you can slash anything you want to delete in the scene? The developers behind Fantastic Contraption implemented a fantastic ‘balloon popping with a sharp pin’ mechanic to delete items from the world. Stuff like swinging a lightsaber around to destroy unwanted content just makes the experience more fun and enjoyable.

  2. What about planting meshes like trees or rocks everywhere you point and click? Perhaps this gets into world-building tools a bit, but it would feel better than having to Alt-drag copies into existence when you have forests to plant.

  3. What if the tip of your laser pointer could spread out into a cone shape with an adjustable radius, so you could select multiple objects at a time and then manipulate them as a group? The laser pointer then becomes more of a flashlight with variable falloff. Maybe the green sphere at the tip becomes a hoop surrounding the desired objects. Only objects completely contained in the cone would be selected, which would help keep unwanted meshes out of the selection (a rough containment test is sketched after this list). You could also center the selection on the farthest object you want selected, to define a distance range for the selection.

  4. I would like an easy way to ‘bookmark’ the user's location/vantage point/world scale to presets, floating markers or voice commands, to jump back and forth between several vantage points while working, so you don't spend unnecessary time navigating. I found this super useful in 3D software like SketchUp; it saves so much time and is also good for creating guided presentations for others evaluating your work. There would of course be animated transitions from one vantage point to the next to make it feel natural, like teleporting I suppose.
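
For the cone select in point 3, a rough containment test could use bounding spheres as a simplification. This is just a sketch; every name in it is hypothetical:

```cpp
// Returns true only if the object's bounding sphere lies entirely inside the
// selection cone and within the selectable depth range. Hypothetical helper,
// not VR Editor source.
#include "CoreMinimal.h"

bool IsFullyInsideCone(const FVector& Apex, const FVector& AxisDir /* normalized */,
                       float HalfAngleRad, float MaxRange,
                       const FVector& SphereCenter, float SphereRadius)
{
    const FVector ToCenter = SphereCenter - Apex;
    const float AlongAxis = FVector::DotProduct(ToCenter, AxisDir);

    // Reject anything outside the depth range (point 3's idea of centering
    // the range on the farthest object you want selected).
    if (AlongAxis - SphereRadius < 0.0f || AlongAxis + SphereRadius > MaxRange)
    {
        return false;
    }

    // Radial slack between the sphere center and the cone wall at this depth;
    // multiplying by cos(half angle) converts it to perpendicular distance.
    const float DistFromAxis = FVector::Dist(ToCenter, AxisDir * AlongAxis);
    const float RadialSlack  = AlongAxis * FMath::Tan(HalfAngleRad) - DistFromAxis;
    return RadialSlack * FMath::Cos(HalfAngleRad) >= SphereRadius;
}
```

Requiring full containment, as above, is exactly what keeps stray background meshes out of the selection.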

No problem, discussing ideas like this is really good!

  1. I like your idea of just swinging to delete, but it might be a bit weird to do with a short-range laser pointer, because then you have to move around a lot, and that would make it harder to delete stuff.
  2. Yeah, we have also been thinking about how to easily add objects without having to drag from the Content Browser or duplicate every time.
  3. I have two problems with this concept. First, we noticed it is really weird to see a laser spread out over distance; you lose your sense of scale and distance. The other thing is that I think you would select a lot of objects that you don't want to select. However, that is a general problem in 3D software in my opinion, so should we just accept it? :stuck_out_tongue:
  4. Our artists who tried the editor also want this, so this will probably be implemented.
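
For reference, a vantage bookmark would only need to store a handful of values. A hypothetical sketch, not our actual code:

```cpp
// Hypothetical vantage-point bookmark: enough state to restore where the
// user stood and how big the world felt. Illustrative only.
#include "CoreMinimal.h"

struct FVantageBookmark
{
    FVector  HeadLocation; // editor camera / HMD root position
    FRotator HeadRotation; // yaw is usually what matters in room-scale VR
    float    WorldScale;   // e.g. 1.0 = life size; other values shrink or grow the world
    FString  Label;        // shown on a floating marker or in a menu
};
```

Restoring one could interpolate the location and world scale over a short blend, or use a teleport-style fade, to keep the transition comfortable.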

Btw. You don’t have to post on both threads. Mike and I discuss a lot of this

Yeah, multi-selection is always a problem in 3D. This was my first crack at the idea, but I can imagine how easily it would break down in many situations. Do you have a multi-select method implemented already, something akin to shift-selecting multiple items? What about grouping or parenting objects together; is this something you guys have figured out in VR?

Mike implemented multi-select: you select an object with one hand, and then you can select others with your other hand. We have been thinking about grouping and attaching, but not that much yet.

These are a bit more hypothetical/philosophical:

  1. There’s probably lots that could be done with physical simulation. designing in VR without some kind of physics or constraints might actually impair good design decisions as well as make the environment feel too magical and not grounded. I also think about affordance, the brain’s hard won knowledge about how to interact with the world, sort of goes out the window when immersed in a completely magical space with no physical constraints.

  2. In the future, I hope to see the new generation of AI (deep learning) used to assist the user in manipulating the world in logical ways. For example, computers currently don't recognize when an object is oriented in an illogical or impossible configuration, such as a tree standing upside down or growing perpendicular out of a wall. The AI could assist the user in correcting these problems. AI might one day lower the learning curve for artists by intelligently assisting them in selecting and using tools, balancing compositions, or taking over repetitive tasks. This kind of assistive technology is in heavy development today, and VR seems like a perfect place to apply it.

Native VR Development

After reading a number of negative comments on the “Build for VR in VR” blog post, I began to wonder if creating value for game/level designers is the way to go. Perhaps all they want is to streamline their workflows with as little energy, time, money and bugs as possible. Maybe they don't want to wave their arms around in VR space for hours and hours. I wonder if editing in VR would be more attractive to less-skilled, wannabe content artists, in the same way Blueprint authoring was meant to invite non-programmer artists to develop their own game logic.

It seems Unity believes this too and has been working on its own standalone native VR development environment, one that doesn't necessarily strive to create value for traditional game developers. I think this is a good discussion topic if anyone wants to chime in. You can read more in the following article:

UpLoadVR Article: http://uploadvr.com/unity-native-vr-announcement-exclusive/

As VR is a fairly new medium, I think we'd be crazy not to give it a go. There is a chance it could turn out to make us more effective.
Maybe it is good for some situations and not for others, and therefore still worth it. As others have pointed out, if the tools and workflows improve over time, it has an even bigger chance.

Yeah, it depends on a lot of things, like the product you are working on and your own preferences. The people here at Epic who worked on the Bullet Train level really like the idea of not having to switch all the time. They had to tweak stuff, see how it looks in VR, take the HMD off, go back to the PC, remember what they needed to change, change it, and go back to VR. That is a long iteration loop just for moving stuff around. Personally, I don't really believe that writing code or Blueprints in VR is better, but editing animations, particles, materials, etc. can be more efficient than on a monitor.

But we won’t discover if it is more efficient until we do it.

For duplication, I would imagine something along the lines of this (a non-VR use case, but it could apply to VR as well):

Select the object(s) to duplicate and press Ctrl+Shift+C. This brings up a see-through “ghost” of the objects at the mouse cursor's position; the user can then left-click areas to paste them at. Right-click to exit “duplication mode”. The reasoning behind Ctrl+Shift+C is quite simple: it follows the design model behind Ctrl+C to copy something.
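
In rough code, that mode could be a tiny state machine. This is just an illustration of the proposed flow; none of these class or function names are real UE4 editor API:

```cpp
// Hypothetical sketch of the proposed duplication mode.
#include "CoreMinimal.h"
class AActor;

enum class EDuplicationMode { Inactive, Placing };

class FDuplicationTool
{
public:
    void OnCtrlShiftC()                        // enter the mode
    {
        if (SelectedActors.Num() == 0) { return; }
        Mode = EDuplicationMode::Placing;      // show translucent ghost previews
        SpawnGhostPreviews(SelectedActors);
    }

    void OnLeftClick(const FVector& CursorWorldPos)
    {
        if (Mode != EDuplicationMode::Placing) { return; }
        PasteCopiesAt(CursorWorldPos);         // stamp a copy, stay in the mode
    }

    void OnRightClick()                        // exit the mode
    {
        Mode = EDuplicationMode::Inactive;
        DestroyGhostPreviews();
    }

private:
    EDuplicationMode Mode = EDuplicationMode::Inactive;
    TArray<AActor*> SelectedActors;

    void SpawnGhostPreviews(const TArray<AActor*>& Actors);
    void PasteCopiesAt(const FVector& Location);
    void DestroyGhostPreviews();
};
```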

I just saw the announcement from Leap Motion about their new Orion hand-tracking system. It looks so much more accurate and smooth compared to their old system. I wonder if they've achieved enough precision to be useful in Unreal's VR editor as an optional input source. Ultimately, I'd rather use my hands to build in VR, as long as it's not a frustrating experience. I'm also impressed by their use of ‘audio haptics’ in their Interaction Engine to give the brain better feedback. Do you currently use some kind of audio feedback in UE4's VR editor?

Really like the demo ‘Blocks’ that they made:

New vs Old system comparison:

I saw the videos today and I really liked them. We are focusing on the Vive and Oculus, and I cannot guarantee what we will support in the future, but it's definitely something to keep in mind.

We are using sounds for feedback, but we want to have more. Personally, I am not sure if devs will want sound, because standing in the office with no vision and no ability to hear makes me feel weird (but that's my personal view).
The controllers let us do vibration-based haptic feedback, which is really good. For example, we currently use that when you hover over a gizmo handle.
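
For anyone curious: UE4 exposes this through `APlayerController::SetHapticsByValue`, which drives the motion controller's vibration directly. A stripped-down sketch of a hover pulse; the gizmo hook names are just illustrative glue, not our actual code:

```cpp
#include "GameFramework/PlayerController.h"
#include "InputCoreTypes.h" // EControllerHand

// Hypothetical hook names; only SetHapticsByValue is real UE4 API.
void OnGizmoHoverBegin(APlayerController* PC, EControllerHand Hand)
{
    // Frequency and amplitude are normalized [0..1]; brief and subtle feels best.
    PC->SetHapticsByValue(/*Frequency=*/1.0f, /*Amplitude=*/0.3f, Hand);
}

void OnGizmoHoverEnd(APlayerController* PC, EControllerHand Hand)
{
    PC->SetHapticsByValue(0.0f, 0.0f, Hand); // stop the vibration
}
```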

I have some suggestions:

1 - Semi-VR compatibility?

There are a lot of VR headsets on the market, and everybody knows that, so why not add a way to configure your headset manually if it doesn't have full support yet? That would be cool!

2 - Compatibility with Cardboard VR and motion controls
3 - Multi-VR compatibility

I think using a Google Cardboard and a Razer Hydra at the same time would be a very cool thing, since some people don't have all that money. Maybe even make it possible to use the keyboard while using Google Cardboard!

One thing I would certainly like is to use the room-scale feature of the Vive so multiple people can get in on the action; maybe one person with a Cardboard VR could simulate something, and so on. I would still prefer to be able to use more than one VR headset at a time, though.

You are right, there are a lot of different HMDs around. I will soon post a survey asking which HMD you would prefer to use with the VR Editor.

I assume you know about my plugin (MultiEdit), because you asked me about this. Epic doesn't support that plugin. The feature you describe is pretty hard to achieve, so if I were you I wouldn't expect it.

I don't know if we are going to support the Hydra, since you can't really buy them anymore. Our first priorities are the Vive and Oculus, because of their motion controllers and because they are supported really well in UE4.

I think that roughly matches your question and my answer above? I do think a spectate feature would be awesome! If you are talking about running the VR editor twice or more on one machine, that wouldn't be doable.

I've been playing with the new Leap Motion Orion release for a week. It really does a great job with low-latency UI interaction. Since it supports the Oculus, I suggest you give it a try with the VR editor and see what you can do with it. After all, Oculus Touch isn't going to be available until the second half of 2016. A lot of new Oculus Rift owners might buy the Leap Motion (like I just did) so they have a way to interact beyond the gamepad.

Frank Taylor, Raleigh, NC