I hope you had a chance to check out our recent Twitch stream where we revealed our latest project: the Unreal Editor in VR. Sweeney also wrote a nice blog post about it yesterday, and we made a short video that shows off a few features.
As we said, we’ll be taking this to GDC and will have more information on availability then, but in the meantime we’re happy to answer most questions here. We’d also love to hear your own ideas for how we can make this feature truly amazing!
Where can I get the VR Editor?
It’s currently under heavy development. There are three ways you can get it.
You can download Unreal Engine 4.12 (currently in preview) from the launcher. No compiling!
I don’t have any questions, but just wanted to say: This is exactly what I’ve been hoping for since the moment I first put on my Rift DK2. You guys are awesome! <3
This looks absolutely fantastic! Most of my simple VR projects up until now were manageable in the standard 2D->HMD->2D workflow, but as the projects grow I’m finding that layout-and-sizing loop frustrating. This solution is exactly what’s needed. I noticed a few minor things worth reviewing, but most of them are pretty obvious and I’m sure they’re already on your to-do list (like no upside-down text!)
I have one suggestion for experimentation. I find the laser-pointer style of selection ergonomically tiring; I notice it when using Tilt Brush. Tilting your wrist in to point isn’t the natural way we point at or select objects in the real world. While we can’t actually point yet (waiting for full hand tracking), another method I have been experimenting with is “hammer” or “xylophone” select. Essentially, the idea is that your right hand holds a hammer or something like a drumstick that you can tap on the left-hand slate surface to select. With such a tool the right hand can stay parallel to the forearm, which saves you from constantly bending the wrist. Think of the motion you would use to play a xylophone, or how a doctor taps a patient’s knee for a reflex test: just gentle taps. Although it takes a little acclimation to learn to aim, it becomes very natural and is much quicker than a laser pointer when you need to hit multiple UI elements in quick succession.
I’ve temporarily lost access to my motion controllers, but should I get them back, I’ll probably make a thread with an example implementation.
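In the meantime, here’s a minimal sketch of the idea in UE4 C++, assuming a small sphere on the right controller acts as the mallet tip and the slate buttons are primitives tagged “VRButton”; all class and tag names here are hypothetical:

```cpp
#include "GameFramework/Actor.h"
#include "Components/SphereComponent.h"
#include "MalletSelector.generated.h"

UCLASS()
class AMalletSelector : public AActor
{
    GENERATED_BODY()
public:
    AMalletSelector()
    {
        // Small query-only sphere, meant to be attached to the right motion controller.
        MalletTip = CreateDefaultSubobject<USphereComponent>(TEXT("MalletTip"));
        MalletTip->SetSphereRadius(1.5f);
        MalletTip->SetCollisionEnabled(ECollisionEnabled::QueryOnly);
        MalletTip->bGenerateOverlapEvents = true;
        RootComponent = MalletTip;
        MalletTip->OnComponentBeginOverlap.AddDynamic(this, &AMalletSelector::OnTap);
    }

    UFUNCTION()
    void OnTap(UPrimitiveComponent* OverlappedComp, AActor* OtherActor,
               UPrimitiveComponent* OtherComp, int32 OtherBodyIndex,
               bool bFromSweep, const FHitResult& SweepResult)
    {
        // Treat any overlap with a tagged slate button as a "click".
        // A real version would probably debounce by tip velocity.
        if (OtherComp && OtherComp->ComponentHasTag(FName(TEXT("VRButton"))))
        {
            UE_LOG(LogTemp, Log, TEXT("Tapped %s"), *OtherComp->GetName());
        }
    }

    UPROPERTY(VisibleAnywhere)
    USphereComponent* MalletTip;
};
```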
I’m working on the VR Editor at Epic as an intern for my graduation project, and I need to write a thesis about it. So I need your help: feedback, concepts, ideas, inspiration, opinions, and requests. I’ll keep my eyes on this thread, but I’ve also created a thread for my research. Feel free to post anything about the VR Editor there!
Me too!! Thank you. We know this is just the beginning of professional creative VR applications, but we’re getting in on the ground floor so that we can learn as much as possible and share early and often.
Yes, but we haven’t tested it with Hydras in a long time. We’ll also try to support a mode where you can use motion controllers without an HMD equipped. For now we’re focusing on devices that UE4 supports out of the box.
We haven’t had a chance to try it yet, but we would like to support that.
If the editor hitches badly, the HMD’s compositor will take over, so you’ll just see a white-box environment temporarily. We’re planning to make more asset loading asynchronous in the editor to avoid that. There’s no real reason we don’t async-load in the editor, except that we never had a strong need for it until now.
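For the curious, the engine already has a general async-load path; here’s a minimal sketch of that pattern (this is not the actual editor change, and the asset path and callback are placeholders):

```cpp
#include "Engine/StreamableManager.h"

static FStreamableManager GStreamableManager;

void RequestAssetAsync(const FStringAssetReference& AssetPath)
{
    // Kick off a non-blocking load; the lambda runs on the game thread
    // once the asset is in memory.
    GStreamableManager.RequestAsyncLoad(AssetPath,
        FStreamableDelegate::CreateLambda([AssetPath]()
        {
            if (UObject* Loaded = AssetPath.ResolveObject())
            {
                UE_LOG(LogTemp, Log, TEXT("Loaded %s"), *Loaded->GetName());
            }
        }));
}
```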
I hope so. It will take a while before we get to that part, but you can expect the regular Blueprint editor to be accessible while in VR in some form.
I might be able to help with that whenever you guys are ready to share. That said, if you’re using Motion Controller components plus input mappings, it should work without code changes. The only thing you would need to do is calibrate the Hydra base origin, which can either be the parent actor position of both motion controllers or a call to Calibrate on the Hydras while maintaining a T-pose.
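To illustrate the calibration step, here’s a rough sketch, assuming you sample both raw controller positions while the user holds the T-pose (all names are made up):

```cpp
#include "CoreMinimal.h"

// Reference point in raw tracker space, captured during calibration.
static FVector GHydraBaseOrigin = FVector::ZeroVector;

// Call while the user holds a T-pose: hands level with the shoulders,
// so the midpoint of the two controllers is a repeatable origin.
void CalibrateHydraBase(const FVector& LeftRaw, const FVector& RightRaw)
{
    GHydraBaseOrigin = (LeftRaw + RightRaw) * 0.5f;
}

// Apply to every subsequent raw sample before feeding it to the engine.
FVector ToCalibratedSpace(const FVector& RawPosition)
{
    return RawPosition - GHydraBaseOrigin;
}
```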
You must already have some offset in your current setup to account for the difference between Touch and the Vive.
Questions
How does the editor handle VR transitions from other experiences, e.g. Virtual Desktop? Does it properly relinquish rendering to the latter, like all other programs do so far?
How is fine-grained manipulation of 3D objects handled in general?
Does Lightmass baking impact VR judder?
Can we edit shaders in VR?
Longer term: Would love to see finger-level interaction instead of button presses, hopefully through some sort of unified architecture :rolleyes:
Notes
You guys should involve Opamp77; he was working on making VR editing a reality before your reveal, and he’s awesome in general.
I second efforts on 3D Blueprints and better 3D UMG interaction than what’s currently available (e.g. collision-based input supported natively; see the sketch below).
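To show the kind of pointer-driven widget input I mean, here’s a minimal sketch written against UWidgetInteractionComponent (which I believe only exists in newer engine builds, so treat that as an assumption); attach it to a motion controller and forward trigger presses as pointer clicks:

```cpp
#include "Components/WidgetInteractionComponent.h"
#include "InputCoreTypes.h"

// Attach a widget pointer to a motion controller; placeholder names.
UWidgetInteractionComponent* SetupWidgetPointer(USceneComponent* ControllerRoot)
{
    UWidgetInteractionComponent* Interaction =
        NewObject<UWidgetInteractionComponent>(ControllerRoot->GetOwner());
    Interaction->SetupAttachment(ControllerRoot);
    Interaction->InteractionDistance = 500.f; // trace length in cm
    Interaction->RegisterComponent();
    return Interaction;
}

// Call these from the controller's trigger press/release handlers so the
// widget receives ordinary pointer events.
void PointerPress(UWidgetInteractionComponent* Interaction)
{
    Interaction->PressPointerKey(EKeys::LeftMouseButton);
}

void PointerRelease(UWidgetInteractionComponent* Interaction)
{
    Interaction->ReleasePointerKey(EKeys::LeftMouseButton);
}
```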
A colleague and I were just chatting about your idea, and we really think it’s interesting. Besides the ergonomics of holding a controller “like a sword” for long periods of time, we think that having to press a trigger with the same hand you’re carefully aiming with decreases the accuracy of your clicks. This is the kind of thing I’m excited to see developers experiment with after we make the foundational stuff available.
Just think of how much better physical shape we’ll all be in! j/k. You’ll find it’s also comfortable to work seated and simply move and scale the world around you to get to everything quickly; you don’t really have to be waving your arms or running around unless you want to.
We want to unify the gizmos but we aren’t sure when. It’s not the right time yet, as we expect to be making a ton of changes to the VR gizmos as we learn about what works best for users.
I’m not exactly sure what you mean, but we are making improvements to WidgetComponent to allow for some of the new interactions in VR. Some of the recent improvements (such as proper support for antialiasing of masked widgets) have already been merged into the main branch, I believe.
Any device that works with UE4’s Motion Controller component should work, as long as you map the buttons and set up the controller artwork. However, we currently expect an HMD to be used at the same time; we’ll work on making that separable later.
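For reference, the mapping side is just UE4’s standard input config; the key names below are the engine’s generic motion controller keys, while the action names are placeholders:

```ini
; DefaultInput.ini (or Project Settings > Input)
+ActionMappings=(ActionName="VRSelect",Key=MotionController_Left_Trigger)
+ActionMappings=(ActionName="VRSelect",Key=MotionController_Right_Trigger)
+ActionMappings=(ActionName="VRGrip",Key=MotionController_Left_Grip1)
+ActionMappings=(ActionName="VRGrip",Key=MotionController_Right_Grip1)
```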
Good question. I haven’t really tried that yet, so someone on our VR team would probably know more, but I expect it behaves like any other UE4 game with regard to the transition.
You can quickly scale the world up to make transformations as fine-grained as you like, or you can use the various snapping aids to help. We’re planning to add some really cool snapping features in the future, but we actually already have a “fancy” VR grid snap that is quite useful.
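To make the scaling point concrete, here’s a sketch of how world zoom and grid snapping can interact; this is an illustration with made-up names, not the editor’s actual code:

```cpp
#include "CoreMinimal.h"

// WorldZoom > 1 means the world has been scaled up around you, so the
// effective grid becomes proportionally finer in world units.
FVector SnapToVRGrid(const FVector& Position, float GridSize, float WorldZoom)
{
    const float EffectiveGrid = GridSize / FMath::Max(WorldZoom, KINDA_SMALL_NUMBER);
    return FVector(
        FMath::GridSnap(Position.X, EffectiveGrid),
        FMath::GridSnap(Position.Y, EffectiveGrid),
        FMath::GridSnap(Position.Z, EffectiveGrid));
}
```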
There is some initially, but we’ll be working on it. Still very early.
We’re planning to allow you to use virtually the entire editor while in VR, with a few exceptions. So yes, that includes the material editor. Initially, some features will be unavailable, but we’ll get to parity over time.
Definitely, me too! We’ve been experimenting with using LEAP controls for certain interactions. I think we’ll get back to that again eventually.
Yes – we’ve been watching his awesome work of course, and already have been in contact with him.
I’d love to see another movement mode where you use the hand controllers to point in a direction, and then use the pad to fly forward or back (no X axis bound; turning handled by turning IRL). Some games have implemented this, and it seems to work even for people prone to sim sickness.
Also it feels like being Superman, so there’s that.
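A minimal sketch of that fly mode, assuming UE4 C++ (names are placeholders):

```cpp
#include "Components/SceneComponent.h"

// PadY in [-1, 1] from the touchpad's vertical axis: fly forward or back
// along wherever the aiming hand points. Turning is left to real-world
// (roomscale) rotation, per the suggestion above.
void TickFlyMovement(float DeltaTime, const USceneComponent* AimHand,
                     float PadY, USceneComponent* PlayerRoot)
{
    const FVector AimDir = AimHand->GetForwardVector();
    const float FlySpeed = 300.f; // cm/s; tune for comfort
    PlayerRoot->AddWorldOffset(AimDir * PadY * FlySpeed * DeltaTime);
}
```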
A swipe-style soft keyboard that uses the touchpads or analog sticks would be really nice too, mainly for things like the BP context menu.
I’d also like to see a feature where we can enable physics on an object during VR-edit-time and use the motion controls to move it around, for natural placement. Not sure if that’s really feasible though.
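The core of that idea is tiny; a hedged sketch, assuming a grab/release pair of handlers (placeholder names):

```cpp
#include "Components/PrimitiveComponent.h"

// While the object is held, let it simulate so it can tumble and settle
// naturally against the environment.
void BeginPhysicsPlacement(UPrimitiveComponent* Target)
{
    Target->SetSimulatePhysics(true);
}

// On release, freeze the settled transform as the placed result.
void EndPhysicsPlacement(UPrimitiveComponent* Target)
{
    Target->SetSimulatePhysics(false);
}
```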
Already implemented (sort of)! You can grip the world, and your laser changes color and shape to indicate that you can “teleport”. When you pull the trigger, you’ll quickly zoom forward. We showed this during the Twitch stream. You can’t go backwards yet; we didn’t expect that to be useful, but we’ll try it out, or maybe even the manual control like you suggested. Thank you.
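Conceptually, the zoom is just a short interpolation instead of an instant snap; here’s a rough sketch of the idea (not our actual code, names made up):

```cpp
#include "Components/SceneComponent.h"

// Interpolate the player root from its start location toward the laser's
// hit point over a short duration; Alpha persists across ticks.
void TickTeleportZoom(float DeltaTime, USceneComponent* PlayerRoot,
                      const FVector& Start, const FVector& Target, float& Alpha)
{
    const float ZoomDuration = 0.2f; // seconds; quick enough to stay comfortable
    Alpha = FMath::Min(Alpha + DeltaTime / ZoomDuration, 1.f);
    PlayerRoot->SetWorldLocation(FMath::Lerp(Start, Target, Alpha));
}
```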
Definitely.
We’re working on “Simulate” support (a whole-world toggle), but selective simulation is a bit trickier, as you might know. While in Simulate, you can of course naturally interact with physical objects, including throwing them around.