VR UI Screen Space on PC


I'm new to Unreal Engine after several years with Unity, and I work on VR projects with Oculus and Vive devices.

Now I want to do something that is simple in Unity, but I can't find any tutorials or workflows for UE4 to get the same result, or it seems to be very tricky xd.
I need a UI in screen space on my PC monitor while the VR user plays:

  • the VR user doesn't see these UIs
  • this is only for the external public
  • it is in the same window as the VR app/game

These UIs are meant to display information at runtime, such as score, life, timer, menu, etc.
A recent game that does the same thing is Half-Life: Alyx.

In Unity, I just have to set my Canvas component to Screen Space, which is the default value.

In UE4, I tested:

  • the blueprint node “Add to Viewport”
  • the widget component inside a blueprint
    Both work normally when VR is not enabled, but both disappear when it is.

Do you have any tips, workflows or other things to get this result?

Thank you for your help =)

You could investigate using this plugin to do what you want…

Yes, in UE4 it's not as straightforward as it seems to be in Unity, at least as far as I know. What I had to do in the past to display a widget on the screen without rendering anything in VR is use the SetSpectatorScreenTexture node. What you see on the monitor while playing in VR is actually a kind of spectator mode, and that node lets you draw a texture onto it. You then have to set the spectator screen mode to Texture Plus Eye. The texture you draw on the screen can come from a widget placed in world space using the Widget component; given a widget component reference, there is a node that returns the render target texture the widget generates.
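The same setup is also possible from C++. Here is a minimal sketch of the steps described above, assuming a custom pawn class (the class and function names are illustrative, not from the project):

```cpp
// Sketch: draw a widget component's render target onto the VR spectator
// screen so it is visible on the PC monitor but not in the headset.
// AMyVRPawn and SetupSpectatorUI are hypothetical names for illustration.
#include "GameFramework/Pawn.h"
#include "Components/WidgetComponent.h"
#include "Engine/TextureRenderTarget2D.h"
#include "HeadMountedDisplayFunctionLibrary.h"

void AMyVRPawn::SetupSpectatorUI(UWidgetComponent* UIWidgetComponent)
{
	// Spectator screen shows our texture composited with one eye's view.
	UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(
		ESpectatorScreenMode::TexturePlusEye);

	// The widget component renders its widget into a render target;
	// feed that texture to the spectator screen.
	if (UTextureRenderTarget2D* RT = UIWidgetComponent->GetRenderTarget())
	{
		UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(RT);
	}
}
```

These are the C++ equivalents of the Blueprint nodes mentioned above; the exact compositing of texture and eye view can be tuned further with the spectator screen mode settings.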

I will try to see if I can add this as an example in the VRContentExamples project, as it's actually not too complex to set up but definitely not straightforward.

Ok thank you guys for the help.

Too bad it's not as simple as in Unity xd

@DownToCode it would be great to have an example in your VRContentExamples =D

Now I know that the keys are on the “Spectator” nodes and the render target.
Then, I need to understand how I can combine the widgets with the render target.
I will check later.

If someone is interested, I found a great topic by Epic on the forum.

[USER=“1395059”]Rangers Johan[/USER] I just updated the VR Content Examples with an example of what you were after. If you go to “Content/VRContentExamples/VRScreenUI” you will find an example of how you can set up a VRPawn that passes some info to a widget rendered to the screen.

A couple of things to watch out for, since the setup isn't as straightforward as it should be:

  • The VR Pawn's tick interval has been reduced so it only ticks every 0.1 seconds instead of every frame. A comment on the tick explains where you can find the property to change it.
  • Since the widget has to exist in the world as a 3D widget for the Widget component to return a valid render target, I set the Widget component to have Owner No See ticked. This means the owner of that widget, the VR pawn, will never see it rendered; it shouldn't be able to see it anyway, since it's parented to the camera. This works perfectly fine with a single-player VR setup, but as soon as you have other cameras in the world and attempt any logic with them, you also have to make sure those cameras don't render the widget. As far as I know, that cannot be done without a custom C++ solution. With scene capture components it's pretty straightforward, though.
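For reference, the two gotchas above can be sketched in a pawn constructor like this (a hedged sketch, not the project's actual code; AMyVRPawn and the component names are assumptions):

```cpp
// Sketch: reduced tick interval + Owner No See on a widget component
// parented to the VR camera. Class and member names are illustrative.
#include "GameFramework/Pawn.h"
#include "Camera/CameraComponent.h"
#include "Components/WidgetComponent.h"

AMyVRPawn::AMyVRPawn()
{
	// Tick only every 0.1 s instead of every frame; the widget texture
	// does not need to be pushed to the spectator screen more often.
	PrimaryActorTick.bCanEverTick = true;
	PrimaryActorTick.TickInterval = 0.1f;

	Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
	Camera->SetupAttachment(RootComponent);

	// 3D widget in the world so its render target is valid,
	// parented to the camera.
	UIWidget = CreateDefaultSubobject<UWidgetComponent>(TEXT("UIWidget"));
	UIWidget->SetupAttachment(Camera);

	// Owner No See: the pawn owning this component never renders it,
	// so the VR user won't see the widget even though it's in the world.
	UIWidget->SetOwnerNoSee(true);
}
```

Note that Owner No See only hides the component from its owning pawn's view; other cameras still render it, which is exactly the multi-camera caveat described above.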

@DownToCode, so fast and so perfect. Thank you for the help =D and for your VR Content.

[USER=“1395059”]Rangers Johan[/USER] no problem at all, I'm happy to add more examples to the project any time.