I have a question regarding VR UI systems.
I’ll give you a simple description of my setup.
It's very similar to the Google Cardboard demo's learning section:
I have a character that can look around and interact, via touch input, with interactable objects placed in the environment.
The interaction is communicated via an interface.
These placed interactable objects (a Cube Blueprint in this example) shouldn't do much except show an information table on interaction.
Currently the table is a Widget component of the Cube, placed relative to it, and its visibility is toggled.
Furthermore, I have a simple UMG menu, which is also placed in the level and is interactable through the WidgetInteraction component on my character.
I have both facing the character, but they are placed at a fixed location in the level.
I want both to show up in front of the character, in the direction the user is looking, when touch input is pressed.
The menu should appear when touch input is pressed and no interactable object is found; the information table should appear when one is.
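The decision of which widget to show on a touch press boils down to a single branch on whether the look trace found an interactable. A minimal sketch in plain C++ (the names are illustrative, not actual Blueprint nodes or UE API):

```cpp
enum class WidgetToShow { InfoTable, Menu };

// On touch press: show the info table if the look trace hit an
// interactable object, otherwise show the menu.
WidgetToShow OnTouchPressed(bool interactableFound) {
    return interactableFound ? WidgetToShow::InfoTable : WidgetToShow::Menu;
}
```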
Both should close when the player is looking away from them.
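For the "close when looking away" behavior, one common approach is comparing the dot product of the camera's forward vector and the direction to the widget against a threshold. A minimal sketch in plain C++, assuming a simple vector type (the struct, function names, and threshold are placeholders, not UE API):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

// True when the widget has drifted outside the view cone and should close.
// cosThreshold of ~0.87 corresponds to roughly a 30-degree half-angle.
bool ShouldCloseWidget(Vec3 cameraPos, Vec3 cameraForward, Vec3 widgetPos,
                       float cosThreshold = 0.87f) {
    Vec3 toWidget = normalize(sub(widgetPos, cameraPos));
    return dot(normalize(cameraForward), toWidget) < cosThreshold;
}
```

In Blueprints this would be the same math with the Dot Product node, checked on Tick or on a timer while a widget is open.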
So my rough idea is to attach these widgets to my character and set their location along my view vector at a fixed distance from the camera,
but there are many different interactable objects, and I don't think it's clever to put that many widget components on the character.
I imagine there should be a way to use a single "container" widget for the job that shows content depending on the context.
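The placement math for such a shared container would be camera position plus view vector times a chosen distance. A minimal sketch in plain C++ (the vector type and function name are placeholders; in Blueprints this is the same as Camera Location + Forward Vector * Distance):

```cpp
struct Vec3 { float x, y, z; };

// Place the shared widget container a fixed distance ahead of the camera,
// along the direction the user is currently looking.
// viewForward is assumed to be normalized.
Vec3 ComputeWidgetLocation(Vec3 cameraPos, Vec3 viewForward, float distance) {
    return {cameraPos.x + viewForward.x * distance,
            cameraPos.y + viewForward.y * distance,
            cameraPos.z + viewForward.z * distance};
}
```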
How do you think something like this could be done?
I hope my description makes sense and you get what I want to achieve.
This is my first VR project; I'm quite new to UE and only use Blueprints.
I’m very thankful for any advice!