Hi, I’m wondering if anyone can help, or if this is even possible.
So the ideal, easiest VR menu would be to place your buttons in a UMG widget, insert that as a 3D widget inside a blueprint, and place the blueprint in front of the camera.
When doing a trace against this, however, all I can return is the blueprint holding the 3D widget itself, not the buttons within the widget.
I noticed that in the VR template floating around here the menu is manually created out of individual components. Yes, this works, but for ease of setup, especially for something more complicated like a multiplayer server list, how do we trace against the buttons within a 3D widget?
For now, UMG widgets do not handle custom input (collision or ray tracing) automatically, so a workaround is required. I used custom collision forwarding with a Leap Motion for my use case, but the same principles apply to ray-traced input.
You can make complex widgets compose without much additional code if you handle input in a reusable way. In my case I pass input from the composed menu widget to all of my sub-widgets (e.g. buttons and scroll lists) and let them handle the forwarded intersection. They all derive from a ‘touchable widget’ class, which means they share the same input-forwarding methods, so the relevant widget will handle interaction in the expected way. For example, the scroll list uses scroll offset + intersected location to determine which item you’re selecting inside the scroll box, whereas a button just checks whether the input is within its bounds.
So the flow would be:
1) Raytrace -> hit location -> convert to widget-local position
2) Feed the 2D position into the touchable sub-components
3) Repeat 2 for each sub-component if they also have sub-components
4) Override onTouchedAtLocation(x, y) and determine the action for that widget.
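The flow above can be sketched in plain C++ (this is an illustrative mock-up, not actual UE4 API; names like TouchableWidget, ForwardTouch, and OnTouchedAtLocation are made up for the example, and the hit-to-local conversion assumes a flat widget quad with known origin and axes):

```cpp
#include <vector>

struct Vec2 { float X = 0, Y = 0; };
struct Vec3 { float X = 0, Y = 0, Z = 0; };

static float Dot(const Vec3& A, const Vec3& B) {
    return A.X * B.X + A.Y * B.Y + A.Z * B.Z;
}

// Step 1: project a world-space trace hit onto the widget quad to get a
// widget-local 2D position. Origin is the widget's top-left corner in world
// space, Right/Up are unit vectors along the widget surface, and
// PixelsPerUnit scales world units to widget resolution.
Vec2 WorldHitToWidgetLocal(const Vec3& Hit, const Vec3& Origin,
                           const Vec3& Right, const Vec3& Up,
                           float PixelsPerUnit) {
    Vec3 Offset{Hit.X - Origin.X, Hit.Y - Origin.Y, Hit.Z - Origin.Z};
    return { Dot(Offset, Right) * PixelsPerUnit,
             Dot(Offset, Up)    * PixelsPerUnit };
}

// Base class for anything that can receive a forwarded touch location.
class TouchableWidget {
public:
    Vec2 Position;  // placement inside the parent's local space
    Vec2 Size;
    virtual ~TouchableWidget() = default;

    // Step 4: each widget decides what a touch at (X, Y) means for it.
    virtual void OnTouchedAtLocation(float X, float Y) = 0;

    // Steps 2-3: accept the parent-local point if it falls inside our
    // bounds, converting it to our own local space before handling it.
    bool ForwardTouch(const Vec2& ParentLocal) {
        const float LocalX = ParentLocal.X - Position.X;
        const float LocalY = ParentLocal.Y - Position.Y;
        if (LocalX < 0 || LocalY < 0 || LocalX > Size.X || LocalY > Size.Y)
            return false;  // outside our bounds, not handled
        OnTouchedAtLocation(LocalX, LocalY);
        return true;
    }
};

// A button only needs to know the touch landed inside its bounds.
class ButtonWidget : public TouchableWidget {
public:
    bool bPressed = false;
    void OnTouchedAtLocation(float, float) override { bPressed = true; }
};

// A scroll list uses scroll offset + intersected location to pick an item.
class ScrollListWidget : public TouchableWidget {
public:
    float ScrollOffset = 0;
    float ItemHeight = 20;
    int SelectedItem = -1;
    void OnTouchedAtLocation(float, float Y) override {
        SelectedItem = static_cast<int>((Y + ScrollOffset) / ItemHeight);
    }
};

// A composed menu repeats the forwarding for each of its sub-widgets,
// which is the one function call forward per widget mentioned below.
class MenuWidget : public TouchableWidget {
public:
    std::vector<TouchableWidget*> Children;
    void OnTouchedAtLocation(float X, float Y) override {
        for (TouchableWidget* Child : Children)
            if (Child->ForwardTouch({X, Y}))
                break;  // first widget containing the point handles it
    }
};
```

In use, you would run your trace, pass the hit through WorldHitToWidgetLocal, and feed the resulting 2D point into the root menu's OnTouchedAtLocation; the same pattern nests for sub-components of sub-components.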
With that setup you can build up larger, complex menus by composing these sub-widgets, with one forwarding call per widget. This seems to be a popular request, so I want to take what I’ve made and make it a bit more universal, maybe as a set of blueprints that can handle both ray tracing and collision. That would make an easy workaround available until Epic decides to add proper VR input to UMG.