Unreal Editor in VR - Official megathread


    Hey everyone!

    I hope you had a chance to check out our recent Twitch stream, where we revealed our latest project: the Unreal Editor in VR. Tim Sweeney also wrote a nice blog post about it yesterday, and we made a short video that shows off a few features.

    As Tim said, we'll be taking this to GDC and will have more information on availability then, but meanwhile we're happy to answer most questions here. We'd also love to hear about your own ideas for how we can make this feature truly amazing!

    Where can I get the VR Editor?

    It's currently under heavy development. There are three ways you can get it:

    1. Download Unreal Engine 4.12 (currently in preview) from the launcher. No compiling!
    2. Download the latest source code for 4.12 (more stable): https://github.com/EpicGames/UnrealEngine/tree/4.12 . Note: you must have your Epic and GitHub accounts linked, as usual.
    3. Use the preview branch with the latest source code changes for the VR Editor (semi-unstable): https://github.com/EpicGames/UnrealE.../dev-vr-editor . Note: you must have your Epic and GitHub accounts linked, as usual.

    How do I enable the VR Editor?

    As of 4.12 it is considered an experimental feature while we continue to work on it.

    Enable it by:
    - Going to Edit -> Editor Preferences
    - Clicking Experimental
    - Checking VR -> Enable VR Editing



    How do I start the VR Editor?

    - Select "VR" on the toolbar

    Happy Editing!



    --Mike
    Last edited by KRushin; 05-06-2016, 08:24 AM. Reason: Updated Information for 4.12

    #2
    I don't have any questions, but just wanted to say: This is exactly what I've been hoping for since the moment I first put on my Rift DK2. You guys are awesome! <3
    Broad Strokes | Jan Kaluza | Marketplace Release: 'Over 9000 Swords' Modular Melee Weapon System
    Currently not available for freelance work
    Dev Blog & Tutorials | Twitter



      #3
      This is really awesome!
      I think this will facilitate level design for programmers and I suspect a wave of pull requests once this feature goes public.
      Website [ LINK ]
      Twitter [ LINK ]
      Support ! [ LINK ]



        #4
        Hey Mike, will developers with Vive or Touch devkits get access to it before GDC?
        VR Shooter Guns - Arcade Shooter for Vive
        Unreal Meetup Franken - Unreal Engine 4 Meetup
        Hands for VR: Basic - For Vive and Oculus [Marketplace]
        Hands for VR: SciFi - For HTC Vive and Oculus Touch [Marketplace]



          #5
          Hello!

          First, I want to thank you for bringing in-editor VR to UE4; it's something I have dreamed of since using UE4 for the first time with my DK1!

          And of course I have a few questions:

          1. Would this in theory work with the Razer Hydras?

          2. Will this work in the viewport of the BP Editor too?

          3. What about performance when loading assets and the like? Does the editor just hang, or does it switch to black during loading?

          4. Any chance we get the BP editor node graph in VR (like in the Disney movie Wreck-It Ralph)?

          Thanks in advance.
          Twitter: @Dr4ch



            #6
            This looks absolutely fantastic, Mike! Most of my simple VR projects up until now were manageable in the standard 2D->HMD->2D workflow, but as the projects grow I am finding that loop frustrating for layout and sizing. This solution is exactly what is needed. I noticed a few minor things worth reviewing, but most of them are pretty obvious and I'm sure they are already on your to-do list (like no upside-down text!)

            I have one suggestion for experimentation. I find the laser pointer style of selection ergonomically tiring; I notice it when using Tilt Brush. Tilting your wrist in to point isn't the natural way that we point at or select objects in the real world. While we can't actually point yet (we're waiting for full hand tracking), another method I have been experimenting with is "hammer" or "xylophone" select. Essentially, the idea is that your right hand has a hammer or something like a drumstick that you can tap on the left-hand slate surface to select. With such a tool the right hand can stay parallel to the forearm, which saves you from constantly bending the wrist. Think of the motion you would use to play a xylophone, or how a doctor taps a patient's knee for a reflex test. Just gentle taps. Although it takes a little bit of acclimation to learn to aim, it becomes very natural and is much quicker than a laser pointer when you need to hit multiple UI elements in quick succession.

            I've temporarily lost access to my motion controllers, but should I get them back, I'll probably make a thread with an example implementation.
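            The tap-select idea above can be sketched as a simple geometric test. This is a plain C++ sketch, not engine code: the names, units, and thresholds are hypothetical, and a real implementation would use the engine's vector types and per-frame controller poses.

```cpp
#include <cmath>

// Minimal 3D vector for the sketch (a real version would use the engine's vector type).
struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// "Xylophone" tap select: fire when the mallet tip is within tapDistance of the
// palette plane AND the tip is moving into the plane fast enough.
// paletteNormal is assumed unit-length, pointing from the palette surface toward the user.
// Units here are hypothetical (cm and cm/s).
bool IsTapSelect(const Vec3& tipPos, const Vec3& tipVel,
                 const Vec3& palettePoint, const Vec3& paletteNormal,
                 float tapDistance = 2.0f, float minApproachSpeed = 10.0f) {
    const Vec3 offset{ tipPos.x - palettePoint.x,
                       tipPos.y - palettePoint.y,
                       tipPos.z - palettePoint.z };
    const float distanceAbovePlane = Dot(offset, paletteNormal);
    const float approachSpeed = -Dot(tipVel, paletteNormal); // > 0 when moving toward the palette
    return distanceAbovePlane >= 0.0f &&
           distanceAbovePlane <= tapDistance &&
           approachSpeed >= minApproachSpeed;
}
```

            Calling this once per frame with the tracked controller-tip position and velocity gives a debounced-feeling tap: slow hovering near the palette does not select, only a deliberate downward strike does.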
            VR R&D @Shopify



              #7
              The future is near!

              It looks really cool, very high tech and Minority Report-like :>

              Although I can't imagine making a whole game while waving my arms around (it would be very tiresome), it's amazing for someone making an environment or a game for VR.

              I see big potential for applications using this kind of technology.

              I have a couple of questions:

              1. The new gizmo with bounds scaling - will this be available in the normal desktop editor? Especially the bounds (stretch) scaling. That would be very useful.

              2. On the stream you showed the asset browser UI. Is it possible to make our own 3D UI (like that asset browser) with UMG and use it in VR mode?
              Game frameworks on Unreal Marketplace:

              Third Person Shooter Kit - February 2017 Community Pick! - https://www.unrealengine.com/marketp...on-shooter-kit

              Side Scroller Shooter Kit - https://www.unrealengine.com/marketp...er-shooter-kit

              Support Discord channel: https://discord.gg/6rgv5Tj



                #8
                Hi Everyone,

                I am working on the VR Editor at Epic as an intern for my graduation, and I need to write a thesis about it. I need your help with feedback, concepts, ideas, inspiration, opinions, and requests. I will keep an eye on this thread, but I have also created a separate thread for my research. Feel free to post anything about the VR editor there!

                Thanks in advance for your input!



                  #9
                  Originally posted by Kashaar View Post
                  I don't have any questions, but just wanted to say: This is exactly what I've been hoping for since the moment I first put on my Rift DK2. You guys are awesome! <3
                  Me too!! Thank you. We know this is just the beginning of professional creative VR applications, but we're getting in on the ground floor so that we can learn as much as possible and share early and often.



                    #10
                    Cross posting from event thread:
                    I didn't notice anything being said about it in the stream, but does this mean we can finally get some proper interaction with 3D widgets? This'd be especially useful, even in non-VR space.



                      #11
                      Originally posted by Mhousse1247 View Post
                      This is really awesome!
                      I think this will facilitate level design for programmers and I suspect a wave of pull requests once this feature goes public.
                      That's what I'm hoping. We're trying to set it up to be extensible so people can try out their own interactions easily in the editor.



                        #12
                        Originally posted by dr4ch View Post
                        Hello!

                        First, I want to thank you for bringing in-editor VR to UE4; it's something I have dreamed of since using UE4 for the first time with my DK1!

                        And of course I have a few questions:

                        1. Would this in theory work with the Razer Hydras?

                        2. Will this work in the viewport of the BP Editor too?

                        3. What about performance when loading assets and the like? Does the editor just hang, or does it switch to black during loading?

                        4. Any chance we get the BP editor node graph in VR (like in the Disney movie Wreck-It Ralph)?

                        Thanks in advance.
                        Good questions.

                        1. Yes, but we haven't tested it with Hydras in a long time. We'll also try to support a mode where you can use motion controllers without an HMD equipped. For now we're focusing on devices that UE4 supports out of the box.

                        2. We haven't had a chance to try it yet, but we would like to support that.

                        3. If the editor hitches badly, the HMD's compositor will take over so you'll just see a whitebox environment temporarily. We're planning to make more asset loading asynchronous in the editor to avoid that. There's not really a reason we don't async load in the editor except that we never had a strong need for it until now.

                        4. I hope so. It will take a while before we get to that part, but you can expect the regular Blueprint editor to be accessible in VR in some form.
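                        The async-loading behaviour described in answer 3 can be sketched in standard C++, with std::async standing in for the editor's actual asset loader. LoadAssetBlocking and the frame loop below are hypothetical illustrations of the pattern, not engine API: the load runs on a worker thread while the frame loop keeps rendering a placeholder, so the editor never blocks long enough for the HMD compositor to take over.

```cpp
#include <chrono>
#include <future>
#include <string>
#include <thread>

// Stand-in for a slow synchronous asset load (hypothetical).
static std::string LoadAssetBlocking(const std::string& name) {
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    return "loaded:" + name;
}

// Kicks off the load on a worker thread so the frame loop never blocks on it.
class AsyncAssetRequest {
public:
    explicit AsyncAssetRequest(const std::string& name)
        : future_(std::async(std::launch::async, LoadAssetBlocking, name)) {}

    // Poll once per frame; returns true once the asset is available.
    bool IsReady() {
        return future_.wait_for(std::chrono::seconds(0)) == std::future_status::ready;
    }

    std::string Take() { return future_.get(); }

private:
    std::future<std::string> future_;
};

// Frame loop: keep ticking (rendering a placeholder) until the asset arrives,
// instead of hitching for the full load time in a single frame.
std::string TickUntilLoaded(AsyncAssetRequest& request) {
    while (!request.IsReady()) {
        // ...render placeholder and keep the HMD fed at frame rate...
        std::this_thread::sleep_for(std::chrono::milliseconds(5));
    }
    return request.Take();
}
```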



                          #13
                          Originally posted by Mike Fricker View Post
                          Good questions.

                          1. Yes, but we haven't tested it with Hydras in a long time. We'll also try to support a mode where you can use motion controllers without an HMD equipped. For now we're focusing on devices that UE4 supports out of the box.
                          I might be able to help with that whenever you guys are ready to share. That said, if you're using motion controllers + input mapping from motion controllers, it should work without code changes. The only thing you would need to do is calibrate the Hydra base origin, which can either be the parent actor position of both motion controllers or a call to Calibrate on the Hydras while maintaining a T-pose.

                          You must already have some offset in your current setup to account for the difference between Touch and the Vive.


                          Questions
                          1. How does the editor handle VR transitions from other experiences, e.g. Virtual Desktop? Does it properly relinquish rendering to the most recent one, like all other programs do so far?

                          2. How is fine-grained manipulation of 3D objects handled in general?

                          3. Does Lightmass baking impact VR judder?

                          4. Can we edit shaders in VR?

                          5. Longer term: I would love to see finger-level interaction instead of button presses, hopefully through some sort of unified architecture.


                          Notes
                          1. You guys should involve Opamp77; he was working on making VR editing a reality before your reveal and is awesome in general.

                          2. I second efforts on 3D blueprints and better 3D UMG interaction than what is currently available in UMG (e.g. collision-based input supported natively).
                          Plugins: Node.js - TensorFlow - Socket.io Client - ZipUtility - Leap Motion - Hydra - Myo - RealSense



                            #14
                            Re: Blueprint Editor
                            Originally posted by Mike Fricker View Post
                            Good questions.
                            2. We haven't had a chance to try it yet, but we would like to support that.
                            Make nodes float in 3D! Let us swim with our spaghetti!!

                            Next up is a VS2015 extension so we can do the same with C++.
                            • Follow me on twitter
                            • Visit our website traverse.world
                            • Checkout our game's forum thread



                              #15
                              Originally posted by astonish View Post
                              I have one suggestion for experimentation. I find the laser pointer style of selection ergonomically tiring; I notice it when using Tilt Brush. Tilting your wrist in to point isn't the natural way that we point at or select objects in the real world. While we can't actually point yet (we're waiting for full hand tracking), another method I have been experimenting with is "hammer" or "xylophone" select. Essentially, the idea is that your right hand has a hammer or something like a drumstick that you can tap on the left-hand slate surface to select. With such a tool the right hand can stay parallel to the forearm, which saves you from constantly bending the wrist. Think of the motion you would use to play a xylophone, or how a doctor taps a patient's knee for a reflex test. Just gentle taps. Although it takes a little bit of acclimation to learn to aim, it becomes very natural and is much quicker than a laser pointer when you need to hit multiple UI elements in quick succession.
                              Yannick and I were just chatting about your idea and we really think it's interesting. Besides the ergonomics of holding a controller "like a sword" for long periods of time, we found that having to press a trigger with the same hand that you're carefully aiming with decreases the accuracy of your clicks. This is the kind of thing I'm excited about developers experimenting with after we make the foundational stuff available.
