Hi,
I am building a level in VR where I want a 3D UI that can be interacted with using the Oculus Touch controller. A laser pointer comes out of the controller, and when the user presses the trigger it should select whatever they are hovering over.
I’ve got the laser pointer and interaction with the UI working correctly. However, I am building my UI programmatically and I don’t know how to find out which sub-widget I have “clicked” on. For my UI I am using a Horizontal Box and adding multiple instances of another widget to the box. I found this thread, which gave these instructions:
1) Raytrace -> Hit Location -> Convert to widget local position
2) Use the 2D position and feed it into touchable sub-components
3) Repeat 2 for each sub-component if they also have sub-components
4) Override onTouchedAtLocation(x,y) and determine the action for that widget.
Using the Widget Interaction component, I have got the hit location and converted it to screen space. I am stumped about how to get the screen positions of the widgets inside the Horizontal Box, though. Does anyone know how I can achieve this?