I’ve found that they work if you don’t look at them in the editor. I have a HUD that displays health etc. attached to the camera and it follows my head. I haven’t figured out how to get it to scale properly yet and the crashing is a massive pain, but I’m sure they’ll resolve these issues pretty quickly.
I just tried UMG and Oculus with 4.7 preview 4 and the problem still exists. Is there an ETA for a fix?
We are working on the same thing. Any thoughts?
The resolution to the original issue is now to use 3D widgets. UMG in its 2D form is projected directly onto the camera, which is where the problem occurs with having two camera “eyes”. After looking into solutions for making the 2D widget appear directly in the eyes, it was determined that the best solution is to use a 3D widget that is attached to the camera but placed slightly away. This creates the illusion of the UI floating in front of your player’s eyes, just like a HUD. It turns out that putting anything directly in the eyes of your player is nauseating and gives them an “I have something stuck in my eye” effect, which can cause strain that is painful.
I had a nice UI experience showing textures in the classic HUD; it serves the same purpose as the widget viewport, a 2D overlay. Of course in VR the best way is to show actors in the scene, but for some projects simple data in a UI could be enough or valid. So I can’t understand why Unreal users and developers are negative on that point. I’m not an engineer, but I suppose it is possible to check whether the HUD is active and show the widget doubled, or something like that. I hope the Unreal developers are working to get this working, in some way.
UMG still isn’t very useful for VR because it attempts to use the system cursor rather than letting Blueprints pass in cursor coordinates and click events. It needs to work more like normal Blueprints.
There is one issue I have with this solution. Since the widget floats a few feet in front of you, when you walk up to a wall or mesh the interface disappears behind that geometry. Is there any elegant fix for this?
I’ve been using UMG just fine with the Oculus Rift; the stipulation is that you must use a 3D widget component placed in the world, as you simply can’t add a UMG widget to the viewport in VR. Also, you must basically code all the key-press interaction yourself.
Adding to the viewport causes the widget to seemingly duplicate itself over each frame… it just gets worse the longer it’s attached to the viewport. Stereo rendering issue, I guess? It wouldn’t be a great solution for VR anyway. Normal functionality returns as soon as you hide the widget or remove it from the viewport.
Dear ,
What follows is to be taken as a positive criticism regarding UMG for VR. Things must improve.
Up to now, as a programmer of 20 years, I have been very impressed by the Unreal Engine and everything you have put together with your team. You have done excellent work on nearly everything… except a few things, including the use of UMG for VR. I have been struggling for the past few days to obtain a decent UI with UMG in VR, and even though it is possible to put a UMG interface inside a 3D widget, it feels like a “sticking plaster solution” and there are several issues:
- Like M.Koutsoubis said, meshes can hide parts of your 3D widget when moving (though this can be solved by putting the 3D widget inside the collision box of your character)
- More troubling, 3D widgets have no option to be rendered in a separate pass, so any effect applied to the scene (such as, for instance, Depth of Field) is also applied to the 3D widget (questions/318646/feature-request-3d-widgets-in-separate-render-pass.html). The sticking plaster is then to render the 3D widget transparent, and for god-knows-what reason DoF does not affect the widget
- There are problems with the real orientation/translation of the camera in VR mode (questions/243325/how-to-get-real-camera-position-in-vr-mode.html), so correctly placing the 3D widget at any time in a scene is not straightforward
- Like BlackRang said, there are also problems with the mouse cursor, although these can be dealt with using a custom cursor, etc.
And the list goes on… None of that is user friendly, and that’s exactly what happens when you try to solve a problem by putting on a sticking plaster (UMG inside a 3D widget)… It leaks everywhere, and it is painful for those of us who try not to spend too much time fixing the leaks. I am pretty sure you would prefer us creating the coolest ever games/applications rather than lurking on the forums to borrow the sticking plaster of a fellow developer. So for now, even though I find UMG to be a great and impressive tool and it works perfectly in non-VR mode, I won’t use it in my applications (compatible both VR / non-VR). That is bad for me and bad for you: fail/fail for both.
Time is precious. Your team has done wonderful work with UE4, but if you want to be a AAA 3D engine in the VR world, you absolutely need a clean UMG. And yes, making the 2D widget appear directly in front of both eyes is, for now, the best solution you could provide us. Don’t forget that many UIs are temporary, like menus, dialogues or specific small interactions… and obviously yes, they will appear in front of the eyes, but that’s OK, particularly if your 3D scene behind still reacts to head tracking. And by the way, putting the UMG inside a 3D widget just in front of the eyes does exactly the same thing as simply duplicating the 2D widget for both eyes; it is just more complicated and error-prone.
Courage, and impress us!
My workaround was to create my own 3D UI with my own BP classes for buttons and other elements. This works well if your UI doesn’t get too complicated, and it looks really nice in VR as it’s made out of “real” 3D geometry. You can animate the elements with timelines and dynamic material instances quite comfortably. It’s a lot of work, but the result is way better than using flat UI elements directly in front of your eyes.
Best regards,
I second this approach; it is what I’ve used for the NexusVR UMG.
Things I think are missing from UMG (3D):
- well-formatted UI Blueprints that can handle collision-based input for you, e.g. a motion controller or object collides with a surface and forwards input to UMG, which can automatically handle sub-component collisions for you. This is currently the biggest workaround bloat: handling custom collision recursively for each UI component.
-better support for inheritance in UMG blueprints in the context of designer
-mouse cursor on plane and focus handling in UMG
I know that we can create workarounds, and yes, for some interfaces it is quite fun (even intellectually) to create those dynamic 3D interfaces.
But… you’re missing the point.
The UE team developed UMG so that we can easily create even complex UIs without having to spend too much time reinventing the wheel.
It would be complete nonsense to have developed UMG for that purpose and yet have it not be compatible with VR. Don’t forget that many of us are developing applications that work in both VR and non-VR mode… and we won’t start creating two different UIs, one for each mode. And yes, some UIs can be pretty complex. For instance, if you plan to create mini games inside your big project (e.g. the Mass Effect security override) or if you are creating a level-up experience graph (like in… well, every RPG-like now). Those UIs can rapidly become complex, and UMG is perfect for that.
Anyhow, UMG creates UIs; that is its sole purpose. The only thing we are asking is that it also works in VR mode. That’s it.
And for that, fixing some “simple” issue would help a lot:
questions/238696/error-in-ue-480-preview-4-when-hud-and-vr-preview.html
And then, after that, it is true that better collision detection with motion controllers would be great!
It has definitely worked for us in VR, as long as you’re using 3D widgets and placing them in the level in an appropriate position. Then you can fix the mouse to the center of the screen and use it to trigger hover-over events, though you have to do part of it manually. The cool part about UMG widgets, though, is that they can be put inside one another, so a lot of the hover-over logic can be nested inside a button-template UMG. In fact, we managed to use the exact same UMG menu for both the regular monitor version of our game and the VR version. We simply engineered all of the input solutions to work simultaneously, and then determine whether to spawn the UMG menu in screen space or in the world. One odd thing about 3D widgets, though, is that you have to override the origin position from 0.5,0.5 to 0,0 every time; the default just doesn’t work. So yes, UMG is actually a very viable option right now for making great VR menus and screen-space menus.
Am I right in understanding that you are speaking about UMG inside a 3D widget that would be placed at a fixed position in the scene and that would not move/orient depending on the viewer? Otherwise, your comment about fixing the mouse cursor to the center of the screen doesn’t make sense… And if yes, then we are not speaking about the same issue regarding UMG. But it is true that if you simply place your UMG inside the level at a fixed position, it does work. Although, if I remember correctly, you may need to have the cursor at the center of the screen, otherwise the position of the cursor is wrong (might be corrected in 4.10.1, per hamsterPL’s claim):
questions/195977/bug-3d-umg-widgets-mouse-position-does-not-transla.html
questions/298261/cannot-interact-with-3d-umg-button.html#answer-318094
questions/257358/3d-widget-mousable-in-screen-space-but-not-world-s.html
The issue that I was describing concerns the use of a UMG inside a 3D widget that follows the viewer/character. So the widget would have to be tied, for instance, to the character or to the HUD. And that’s where everything goes wrong, unless they have fixed the above problems. And note that indeed, for those kinds of UMG/3D widgets, it should work in screen space, and it does not (cf. my post above):
questions/238696/error-in-ue-480-preview-4-when-hud-and-vr-preview.html
Yes, that is very cool, and that’s why UMG really needs to work perfectly with VR: it has been very nicely developed by the UE team.