Unreal Editor in VR - Research

I posted this on the main thread as well, but here are some of my ideas:

  1. Voice commands applied to whatever you are looking at or pointing at would be a good addition to direct manipulation with the touch controllers. “Scale 150%”, “New light”, “Add material”, “Scale world 50%”, “Add tree mesh”… This seems like it would reduce the amount of pointing, calling up the browser, or the difficulty of entering precise input values.
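
  To make the idea concrete, here is a minimal sketch of how such utterances might be parsed into editor actions. The `VoiceCommand` struct, the `ParseUtterance` name, and the verb-plus-number grammar are all my own assumptions for illustration, not an existing Unreal API:

```cpp
#include <optional>
#include <sstream>
#include <string>

// Hypothetical parsed voice command: a verb phrase plus an optional numeric argument.
struct VoiceCommand {
    std::string Verb;       // e.g. "Scale", "Scale world"
    double Value = 0.0;     // e.g. 150 for "Scale 150%"
    bool HasValue = false;
};

// Parse utterances like "Scale 150%" or "Scale world 50%" into a command.
// Any whitespace-separated token that reads as a number (with an optional
// trailing '%') becomes the value; everything else joins the verb phrase.
std::optional<VoiceCommand> ParseUtterance(const std::string& Text) {
    std::istringstream Stream(Text);
    std::string Word;
    VoiceCommand Cmd;
    while (Stream >> Word) {
        // Strip a trailing '%' so "150%" parses as 150.
        if (!Word.empty() && Word.back() == '%') Word.pop_back();
        try {
            size_t Pos = 0;
            double Num = std::stod(Word, &Pos);
            if (Pos == Word.size()) {
                Cmd.Value = Num;
                Cmd.HasValue = true;
                continue;
            }
        } catch (...) {
            // Not a number; fall through and treat it as part of the verb.
        }
        if (!Cmd.Verb.empty()) Cmd.Verb += ' ';
        Cmd.Verb += Word;
    }
    if (Cmd.Verb.empty()) return std::nullopt;
    return Cmd;
}
```

  A dispatcher could then map the verb phrase to the editor action on whatever the user is gazing at or pointing at, falling back to the browser only when no command matches.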

  2. How about some extra tools such as a measuring tape, protractor, 3D guides, and construction planes for measuring out real-world scale and layout? I love how SketchUp works: if you want a doorway 8 ft high and 3.5 ft wide, it’s super easy to measure it out with the guides first, then build on top of them. Snapping to construction guides is awesome. Most non-CAD 3D software doesn’t have good measuring or layout tools. I want to work more like a construction engineer in VR.
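
  The snapping behavior described above can be sketched in a few lines. The guide positions and tolerance below are illustrative values, and a real implementation would run this per axis against the guide planes in the scene:

```cpp
#include <cmath>
#include <vector>

// Snap a dragged value (e.g. a wall endpoint along one axis, in feet) to the
// nearest construction guide if it falls within SnapTolerance of one;
// otherwise return the raw value unchanged.
double SnapToGuides(double Value, const std::vector<double>& Guides,
                    double SnapTolerance) {
    double Best = Value;
    double BestDist = SnapTolerance;
    for (double Guide : Guides) {
        double Dist = std::fabs(Guide - Value);
        if (Dist <= BestDist) {  // Prefer the closest guide in range.
            BestDist = Dist;
            Best = Guide;
        }
    }
    return Best;
}
```

  With guides at 0, 3.5, and 8 ft (the doorway example), a controller drag that lands near 8 ft snaps exactly onto the guide, while a drag in open space is left alone.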

  3. I would like the ability to place the editor windows in user-definable locations, for example: (1) on the controller (like it is now); (2) docked as a HUD at a desired position and size in the user’s field of view; (3) docked at desired locations and sizes in the world, so the user could create their own virtual workspace… All three modes seem really interesting, so why use only one approach? Also, create a very intuitive way to grab the editor screens off the controller and stick them where you want, in the HUD or someplace in the world.

  4. How about taking possession of a character in VR so you can use your own body to put it in a pose or teach it how to move or perform a complex action? It would of course require additional full-body motion capture, but I just like the idea of doing it in VR. When you jump inside a body you are animating in first person; leave the body and you can refine the animation from a third-person point of view. I can imagine onion skinning in VR, and also manipulating a 3D spline for path-based animation. So many interesting things to explore with VR animation.

  5. I think directing Matinee/Sequencer cutscenes in VR would be super cool. You first block out your shots, stage props and actors, set up the lighting, and perform your own camera work and acting. Then you perform virtual editing and camera cuts. You could loop your sequence and keep adding layer upon layer. You might manipulate time in ways similar to world scaling, with gestures.

  6. I really want the ability to walk around my scene and set up all the game logic in VR. For example, I could add a collision volume in a doorway, and when I walk into it, a floating event alert would appear. I would then connect this to a Blueprint handler that I add to the scene. When I touch the 3D Blueprint hovering in space, it would suddenly expand into a representation of a Blueprint graph with 3D plumbing and layout. There is, as yet, no programming metaphor in VR space that I’m aware of, but it seems like Blueprints should work. (I think Epic should sponsor a contest to design and illustrate the best and most exciting way to bring Blueprints into VR space.)

  7. Add some sort of filtering to the controller motion to smooth out shaky hands while the editor screens are visible. The shaking makes the screens appear unstable and makes accurate interactions hard to perform. That said, I’m not sure how the experience actually feels in VR.
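
  One common approach to this is an exponential low-pass filter run per axis on the controller pose. Here is a minimal 1D sketch; the smoothing coefficient is an illustrative assumption (a production version, such as the well-known one-euro filter, would adapt the cutoff to hand speed so slow aiming is smooth but fast motion stays responsive):

```cpp
// Simple exponential low-pass filter for a 1D controller signal (run one
// instance per position/rotation axis). Alpha near 0 smooths heavily,
// Alpha near 1 tracks the raw input tightly.
struct LowPassFilter {
    double Alpha;
    bool Initialized = false;
    double State = 0.0;

    explicit LowPassFilter(double InAlpha) : Alpha(InAlpha) {}

    // Blend the new raw sample with the previous filtered state.
    double Filter(double Raw) {
        if (!Initialized) {
            State = Raw;  // Seed with the first sample to avoid a startup lurch.
            Initialized = true;
        } else {
            State = Alpha * Raw + (1.0 - Alpha) * State;
        }
        return State;
    }
};
```

  Applying heavier smoothing only while an editor screen is being pointed at, as suggested above, would keep general hand motion feeling direct.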