I am having some trouble and would like to know if anyone knows what is going on here.
I am working on a VR game and set up an interactive widget, placed it in a blueprint, and was having fun with the new experimental widget interaction.
It worked perfectly fine, with both hover and click events, after I set the pointer key to the left mouse button.
So I moved on with my project and tweaked other settings, put in meshes, and set up other blueprints.
All of a sudden the widget interaction has no effect on any widget at all; it is not even firing a hover event.
I turned on the debug display and could see that the interaction trace actually passed through the widget, so it was not a distance problem.
I would like to know if anyone has experienced this issue, since I have done a fair bit of research and found no one else reporting this problem yet.
I changed the widget Interaction Source to Custom and manually fed in the hit result every tick, and the interaction worked again.
This could be a quick fix for people who want to solve it desperately.
However, it is not an ideal solution for me because I have to minimize the use of tick events as I am working on a VR game.
So the problem persists and I hope someone can help. Thank you.
It's worth mentioning I ran into a similar issue. I had to ensure that the Virtual User Index of both of my widget interaction components was different from the other's – so it's less about changing them than it was about making sure they were different.
I ran into this error and found the same thing as “akjim”: setting the WIC's (Widget Interaction Component's) Interaction Source from World to Custom, plus setting up a line trace that passes its Hit Result into the WIC's Custom Hit Result on Tick, worked for me as well, but I didn't want to use that (for VR purposes). Instead, I dug into the engine code and figured out the problem.
I came from VR_Works UE4.15 -> VR_Works UE4.18, so I compared the WidgetInteractionComponent code from both of them (“D:\UnrealEngine\Engine\Source\Runtime\UMG\Private\Components\WidgetInteractionComponent.cpp”) and found they went from doing a single line trace to a multi line trace. This means UE now cycles through all of the hits and tries to assign the correct one to the normal hit result that is returned, if one of them hit a widget. What actually happens, though, is that as soon as one of the multi-trace hits is not a widget, the loop stops searching. I was hitting other overlap collision elements in my scene (I was tracing on Visibility) before I was hitting my 3D widget, so the for loop would instantly break and never find my widget(s). I commented out this break on line 210 and that fixed the issue.
I also added a break on a new line after line 204, so that the first widget the trace hits is used and the engine does not continue searching for other widgets. Example: Widget1 is in front of Widget2. Widget1 will be interacted with first, over Widget2, until it is removed from the trace's path to Widget2.
Just for future reference, for anyone who searches for solutions to this issue via Google, I would like to add that I had the exact same problem. The solution turned out to be that the collision sphere on the object holding the widget was set to Overlap for the Visibility and Camera channels. By setting those to Ignore, my widget started working. Anything random in the area that affects the Visibility or Camera channels in the SLIGHTEST can stop a widget from working. It took me hours to figure that out!
Hi, I’m having the same issue with buttons/widgets being hard to click in VR.
The widgets work fine in the blueprint, but once you’re in VR they stop being clickable/hoverable, and it’s really random. A widget that works fine from a distance will stop working if you’re a bit closer, and the other way round. I can’t find any pattern to when they stop being active. It’s not just a distance thing; it might be a trace issue, something passing through or similar.
In the meantime I really need them to work at all times, so I’m going to set the widget interaction source to Custom, as **FatalFeel** mentioned, but what do you plug into the Hit Result?
My issue was that the widget interaction component was rotated and always pointing to the right, even though the motion controller had it pointing forward. This was because the motion controller was attached to a socket on the character’s skeletal mesh and used whatever orientation that socket had.
The fix: turn on “Show Debug” for the widget interaction component, then use “AddLocalRotation” to rotate the WidgetInteraction component until it points forward.
My issue was that the collision mesh of the skeletal mesh I had the widget attached to was interfering with its detection. Once I set up proper collisions on it, everything worked perfectly. Before that, the widget component wasn’t even detecting the debug line coming out of the player’s Widget Interaction Component.
My issue was that I had many 3D widget buttons as widget components in one Actor, all sharing the same physical space to create a set of menu “screens”. I was turning their visibility on and off to show only the currently desired set of buttons, but I found that I also had to toggle collision on the components, so that collision was enabled when they were visible and disabled when they were not.
For those who are having this problem in a bigger project, it is most likely because of collisions between the widget blueprint and other collision boxes. The solution is to check the trace channel on your widget interaction and whether it blocks or overlaps other collision boxes. My preferred solution is to create a new collision channel through the Project Settings with a default response of Ignore. Then set your Widget Interaction component and the widget in the blueprint to use that new channel (make the widget overlap it and leave the rest on Ignore). That solved the problem for me.
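For reference, a channel created that way ends up in the project's DefaultEngine.ini. The fragment below is a sketch of what that entry looks like; the channel slot (`GameTraceChannel1`) and the `WidgetTrace` name are placeholder assumptions, and in practice you would create the channel through Project Settings -> Collision rather than hand-editing the file:

```ini
[/Script/Engine.CollisionProfile]
; Custom trace channel that everything ignores by default.
; Slot and name are placeholders; pick a free slot in your own project.
+DefaultChannelResponses=(Channel=ECC_GameTraceChannel1,DefaultResponse=ECR_Ignore,bTraceType=True,bStaticObject=False,Name="WidgetTrace")
```

Because the default response is Ignore, nothing in the scene can block the trace unless you explicitly opt a widget in by setting it to Overlap on that channel.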