Unreal Editor in VR - Official megathread

Hey everyone!

I hope you had a chance to check out our recent Twitch stream, where we revealed our latest project: the Unreal Editor in VR. Sweeney wrote a nice blog post about it yesterday, and we also made a short video that shows off a few features.

As mentioned, we’ll be taking this to GDC and will have more information on availability then, but in the meantime we’re happy to answer most questions here. We’d also love to hear your ideas for how we can make this feature truly amazing!

Where can I get the VR-Editor?

It’s currently under heavy development. There are three ways you can get it:

  1. You can download Unreal Engine 4.12 (currently in preview) from the launcher. No compiling!
  2. Download the latest source code for 4.12 (more stable): https://github.com/EpicGames/UnrealEngine/tree/4.12 . Note: your Epic and GitHub accounts must be linked, as usual.
  3. Preview branch with the latest source code changes for the VR-Editor (semi-unstable): https://github.com/EpicGames/UnrealEngine/tree/dev-vr-editor . Note: your Epic and GitHub accounts must be linked, as usual.

How do I enable the VR-Editor?

As of 4.12, it is considered an experimental feature while we continue to work on it.

Enable it by:

  • Go to Edit -> Editor Preferences
  • Click Experimental
  • Under VR, check Enable VR Editing
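For those who prefer config files, the same toggle should be reachable there. A sketch only — the section and key names below are my assumptions based on the Experimental settings panel, so verify them against the EditorPerProjectUserSettings.ini in your project’s Saved/Config folder:

```ini
; Assumed section/key names - confirm against your own
; EditorPerProjectUserSettings.ini before relying on this.
[/Script/UnrealEd.EditorExperimentalSettings]
bEnableVREditing=True
```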


How do I start VR-Editor?
  • Select “VR” on the toolbar
  • Happy editing!



I don’t have any questions, but just wanted to say: This is exactly what I’ve been hoping for since the moment I first put on my Rift DK2. You guys are awesome! <3

This is really awesome!
I think this will facilitate level design for programmers and I suspect a wave of pull requests once this feature goes public.

Hey, will developers with Vive or Touch devkits get access to it before GDC?


First, I want to thank you for bringing in-editor VR to UE4; it’s something I have dreamed of since using UE4 for the first time with my DK1!

And of course I have a few questions:

  1. Would this in theory work with the Razer Hydras?

  2. Will this work in the viewport of the BP Editor too?

  3. What about performance when loading assets and such — does the editor just hang, or will it switch to black during loading?

  4. Any chance we get the BP editor node graph in VR (like in the Disney movie Wreck-It Ralph)?

Thanks in advance.

This looks absolutely fantastic! Most of my simple VR projects up until now were manageable in the standard 2D->HMD->2D workflow, but as the projects grow I am finding that loop for layout and sizing frustrating. This solution is exactly what is needed. I noticed a few minor things worth reviewing, but most of them are pretty obvious and I’m sure they are already on your to-do list (like no upside-down text!)

I have one suggestion for experimentation. I find the laser-pointer style of selection ergonomically tiring; I notice it when using Tilt Brush. Tilting your wrist in to point isn’t the natural way that we point at or select objects in the real world. While we can’t actually point yet (waiting for full hand tracking), another method I have been experimenting with is “hammer” or “xylophone” select. Essentially, the idea is that your right hand has a hammer or something like a drumstick that you can tap on the left-hand slate surface to select. With such a tool the right hand can stay parallel to the forearm, saving you from constantly bending the wrist. Think of the motion you would use to play a xylophone, or how a doctor taps a patient’s knee for a reflex test. Just gentle taps. Although it takes a little acclimation to learn to aim, it becomes very natural and is much quicker than a laser pointer when you need to hit multiple UI elements in quick succession.

I’ve temporarily lost access to my motion controllers, but should I get them back I’ll probably make a thread with an example implementation

Future is near! :slight_smile:

It looks really cool, very high tech and Minority Report like :>

Although I can’t imagine making a whole game while waving my arms around — that would be very tiresome. But for someone making an environment or a game for VR, it’s amazing.

I see big potential for applications using this kind of technology.

I have a couple of questions:

1. The new gizmo with bounds scaling — will this be available in the normal desktop editor? Especially the bounds (stretch) scaling; that would be very useful.

2. On the stream you showed the asset browser UI. Is it possible to make our own 3D UI (like that asset browser) with UMG and use it in VR mode?

Hi Everyone,

I am working on the VR Editor at Epic as an intern for my graduation project, and I need to write a thesis about it. So I’d appreciate your feedback, concepts, ideas, inspiration, opinions and requests. I will keep an eye on this thread, but I also created a separate thread for my research. Feel free to post anything about the VR editor there!

Thanks in advance for your input!

Me too!! Thank you. We know this is just the beginning of professional creative VR applications, but we’re getting in on the ground floor so that we can learn as much as possible and share early and often.

Cross posting from event thread:

That’s what I’m hoping. We’re trying to set it up to be extensible so people can try out their own interactions easily in the editor.

Good questions.

  1. Yes, but we haven’t tested it with Hydras in a long time. We’ll also try to support a mode where you can use motion controllers without wearing an HMD. For now we’re focusing on devices that UE4 supports out of the box.

  2. We haven’t had a chance to try it yet, but we would like to support that.

  3. If the editor hitches badly, the HMD’s compositor will take over so you’ll just see a whitebox environment temporarily. We’re planning to make more asset loading asynchronous in the editor to avoid that. There’s not really a reason we don’t async load in the editor except that we never had a strong need for it until now.

  4. I hope so. :slight_smile: It will take a while before we get to that part. But you can expect the regular Blueprint editor to be accessible while in VR in some form.

I might be able to help with that whenever you guys are ready to share. That said, if you’re using motion controllers plus input mapping, it should work without code changes. The only thing you would need to do is calibrate the Hydra base origin, which can either be the parent actor position of both motion controllers or a call to Calibrate on the Hydras while maintaining a T-pose.
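For the Hydra case specifically, the calibration could be as simple as caching the midpoint of the two controller positions during the T-pose and reporting everything relative to it afterwards. A minimal standalone sketch (plain C++; the types and function names here are mine, not engine API):

```cpp
// Hypothetical sketch of T-pose base calibration, not engine code.
struct Vec3 { float x, y, z; };

// During the T-pose, take the midpoint of the two controller positions
// as the shared base origin.
Vec3 calibrateOrigin(const Vec3& left, const Vec3& right) {
    return { (left.x + right.x) * 0.5f,
             (left.y + right.y) * 0.5f,
             (left.z + right.z) * 0.5f };
}

// Afterwards, report each controller pose relative to that origin.
Vec3 relativeTo(const Vec3& p, const Vec3& origin) {
    return { p.x - origin.x, p.y - origin.y, p.z - origin.z };
}
```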

You must already have some offset in your current setup to account for the difference between Touch and the Vive.


  1. How does the editor handle VR transitions from another experience, e.g. Virtual Desktop? Does it properly relinquish rendering to the most recent one, like all other programs have done so far?

  2. How is fine level manipulation of 3d objects handled in general?

  3. Lightmass baking — does it impact VR judder?

  4. Can we edit shaders in VR?

  5. Longer term: Would love to see finger-level interaction instead of button presses, hopefully through some sort of unified architecture :rolleyes:


  1. You guys should involve Opamp77, he was working on making VR editing a reality before your reveal and is awesome in general.

  2. I second efforts on 3D blueprints and better 3D UMG interaction than what is currently available (e.g. collision-based input supported natively).

Re: Blueprint Editor

Make nodes float in 3D! Let us swim with our spaghetti!!

Next is VS2015 extension so we can do the same with C++

We were just chatting about your idea, and we really think it’s interesting. Besides the ergonomics of holding a controller “like a sword” for long periods of time, we find that having to press a trigger with the same hand that you’re carefully aiming with decreases the accuracy of your clicks. This is the kind of thing I’m excited about developers experimenting with after we make the foundational stuff available.

Thank you!

Just think of how much better physical shape we’ll all be in! j/k. :slight_smile: You’ll find it is also comfortable to work seated and simply move the world around to get to everything quickly; you don’t really have to be waving your arms or running around unless you want to.

We want to unify the gizmos but we aren’t sure when. It’s not the right time yet, as we expect to be making a ton of changes to the VR gizmos as we learn about what works best for users.

Yes – it should be easy to do that.

I’m not exactly sure what you mean, but we are making improvements to WidgetComponent to allow for some of the new interactions in VR. Some of the recent improvements (such as proper support for antialiasing of masked widgets) have already been merged into the main branch I believe.

Any devices that work with UE4’s Motion Controller Component should work, as long as you map the buttons and set up the controller artwork. However, we currently expect an HMD to be used at the same time. We’ll work on making that a separable feature later.

Good question. I haven’t really tried that yet, so someone on our VR team would probably know more, but I’d expect it to behave similarly to any other UE4 game with regard to the transition.

You can quickly zoom the world up to make transformations as fine-grained as you like, or you can use the various snapping aids to help. We’re planning on adding some really cool snapping features in the future, but we actually already have a “fancy” VR grid snap that is quite useful.
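My guess is that the scale-aware part of that grid snap boils down to scaling the snap step by the current world scale, so snapping stays usable whether the world is shrunk down or blown up. A rough standalone sketch (my own assumption about the math, not Epic’s implementation):

```cpp
#include <cmath>

// Hypothetical sketch: snap a coordinate to a grid whose effective step
// grows and shrinks with the current world scale.
float snapToGrid(float value, float baseStep, float worldScale) {
    float step = baseStep * worldScale; // zoomed-up world -> coarser snap
    return std::round(value / step) * step;
}
```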

There is some initially, but we’ll be working on it. Still very early.

We’re planning to allow you to use virtually the entire editor while in VR, with a few exceptions. So yes, that includes the material editor. Initially, some features will be unavailable, but we’ll get to parity over time.

Definitely, me too! We’ve been experimenting with using LEAP controls for certain interactions. I think we’ll get back to that again eventually.

Yes – we’ve been watching his awesome work of course, and already have been in contact with him. :slight_smile:

I’d love to see another movement mode where you use the hand controls to point in a direction, and then can use the pad to fly forward or back (no X-axis bound, turning handled by IRL turning). Some games have implemented this and it seems to work even for people prone to sim sickness.
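For anyone wanting to prototype this, the core of that movement mode is tiny: take the direction the controller points along and scale it by the thumbpad’s forward/back axis. A standalone sketch (plain C++; the names are mine, not engine API):

```cpp
// Hypothetical sketch of pointer-directed flying, not engine code.
struct Vec3 { float x, y, z; };

// direction: unit vector the controller points along.
// axis: thumbpad forward/back input in [-1, 1] (negative flies backward).
// speed: maximum fly speed.
Vec3 flyVelocity(const Vec3& direction, float axis, float speed) {
    return { direction.x * axis * speed,
             direction.y * axis * speed,
             direction.z * axis * speed };
}
```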

Also it feels like being Superman, so there’s that.

A swipe-style soft keyboard that uses the touch pads or analog sticks would be really nice too, mainly for things like the BP context menu.

I’d also like to see a feature where we can enable physics on an object during VR-edit-time and use the motion controls to move it around, for natural placement. Not sure if that’s really feasible though.

Already implemented! (sort of). You can grip the world, and your laser changes color and shape to indicate that you can “teleport”. When you pull the trigger you’ll quickly zoom forward. We showed this during the Twitch stream. You can’t go backwards yet – we didn’t expect that to be useful. We’ll try it out, or maybe even the manual control like you suggested. Thank you.


We’re working on “Simulate” support (whole world toggle), but selective simulation is a bit more tricky, as you might know. While in Simulate, you can of course naturally interact with physical objects, including throwing them around.