I would like a way of creating BSP geometry in VR. I could then export it as a mesh and create detailed, correctly scaled buildings, which would be a godsend. I would also like a way to use a traditional mouse and keyboard inside VR in place of the motion controllers.
I agree. Tracking a real physical mouse and keyboard 1:1 to a virtual representation of them in VR using Lighthouse would be great, so that a user could really be productive with traditional 2D interfaces in VR, such as complex parts of the editor like Blueprints. Why do everything with touch controllers? You could simply choose the best input device for any particular task. Tracking a physical keyboard means you could see it, pick it up, and move it around, but still know precisely where to peck the keys. With a flat representation of a Blueprint graph hovering in front of you, you could reach for the mouse and have precision control over nodes and their connections.
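For what it's worth, here is a minimal sketch of how a tracked physical keyboard could be represented, assuming a Vive Tracker strapped to the keyboard and an engine version where UMotionControllerComponent exposes a MotionSource name; the AVirtualKeyboard class and the choice of the "Special_1" source are my own illustration, not anything Epic has shipped:

```cpp
// VirtualKeyboard.h -- hypothetical example, not an engine class.
// Requires the "HeadMountedDisplay" module for UMotionControllerComponent.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MotionControllerComponent.h"
#include "Components/StaticMeshComponent.h"
#include "VirtualKeyboard.generated.h"

UCLASS()
class AVirtualKeyboard : public AActor
{
	GENERATED_BODY()

public:
	AVirtualKeyboard()
	{
		// The tracker pose drives the root transform, so the virtual keyboard
		// mesh sits exactly where the physical keyboard does (1:1 via Lighthouse).
		Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
		Tracker->MotionSource = FName(TEXT("Special_1")); // assumed Vive Tracker slot
		RootComponent = Tracker;

		KeyboardMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("KeyboardMesh"));
		KeyboardMesh->SetupAttachment(Tracker);
	}

private:
	UPROPERTY()
	UMotionControllerComponent* Tracker;

	UPROPERTY()
	UStaticMeshComponent* KeyboardMesh;
};
```

Pick it up and move it and the virtual copy follows; key input itself would still arrive through the normal OS keyboard events.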
My humble request would be for mobile VR Editor support!
+1
Hello everyone, I hope you all watched the GDC livestream. I made a survey for the VR Editor to get a better grasp of who is interested in it, which hardware you want to use, and what you would like to see in the future.
Thanks in advance for taking the survey and have a nice day!
When can we beta test the VR Editor?
It should be on GitHub now.
I'm very excited about the new developments with the VR Editor. Great presentation at GDC. Things I like the most:
- The virtual workspace: being able to position the editor windows around the scene. (Not sure how the material or blueprint editor can be used in VR, except for reference.)
- Love the colored laser pointers that I assume change color depending on which buttons are being pressed. Brilliant!
I can’t wait to test it out! Thanks for all the hard work and putting it in our hands early!
Thank you sir! Yeah, the colors change depending on your “state”: red (orange in the demo map because of post processing) is the default state, green when moving an object, and yellow when you are moving the world (this also indicates teleporting). You also get a blue scale indicator between your hands when rotating/scaling the world (it’s like a 3D progress bar). It’s still in really early stages, but giving everyone access to try it and give feedback is really important to us.
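Based on that description, the mapping is essentially a small state-to-color table; here is a rough C++ sketch with hypothetical names (the enum and function are mine, and the blue rotate/scale indicator is a separate widget rather than a laser color):

```cpp
#include "CoreMinimal.h"

// Hypothetical interaction states, matching the behavior described above.
enum class ELaserState : uint8
{
	Default,      // idle pointing
	MovingObject, // dragging an actor
	MovingWorld   // grip-moving the world, also used for teleport
};

FLinearColor GetLaserColor(ELaserState State)
{
	switch (State)
	{
	case ELaserState::MovingObject:
		return FLinearColor::Green;
	case ELaserState::MovingWorld:
		return FLinearColor::Yellow;
	case ELaserState::Default:
	default:
		// Reads as orange in the demo map because of post processing.
		return FLinearColor::Red;
	}
}
```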
Hi!
I have a feature request that is kinda important for my workstation setup.
I use Virtual Desktop (Virtual Desktop on Steam) and have it set up so I can do regular desktop work with the Vive.
BUT after starting the UE editor, it takes over or “allocates” the Vive from Virtual Desktop right from the start, even though UE doesn’t actually use it at that point. UE only uses the Vive when you press the “edit in VR” or “play in VR” button, and when you press Esc to exit it doesn’t release the Vive but instead shows the last framebuffer in a locked state in space (a useless allocation). Only when you quit UE does Virtual Desktop get control over the Vive back.
What I wish for is that the editor ONLY allocates and takes control of the Vive when the “edit in VR” button is pressed, and as soon as the user goes back to the desktop by pressing Esc, UE should release the Vive and let Virtual Desktop take back control. Same for “play in VR”.
Then I could use the Vive as a virtual workstation AND go into the VR Editor and also play in VR without ever needing to take off the Vive, or rather without needing to switch back and forth between regular physical monitors and the Vive.
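In engine terms, the request boils down to something like the sketch below: only turn stereo on when entering VR editing/preview, and fully turn it off on Esc so the compositor is free for Virtual Desktop again. This is just an illustration of the idea via the existing IStereoRendering interface; whether EnableStereo(false) alone actually releases the Vive back to other applications is exactly the open question here, and the SetVRActive helper is hypothetical:

```cpp
#include "Engine/Engine.h"
#include "StereoRendering.h"

// Hypothetical helper: acquire the HMD on "edit in VR" / "play in VR",
// release it again when the user presses Esc, instead of freezing on the
// last framebuffer while still holding the device.
void SetVRActive(bool bActive)
{
	if (GEngine && GEngine->StereoRenderingDevice.IsValid())
	{
		GEngine->StereoRenderingDevice->EnableStereo(bActive);
	}
}
```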
Let's keep it all virtual
edit: should I have posted this in the official megathread or in a new thread instead of here?