3D UMG widget buttons only respond in certain positions

Hi, I’m working on a project where we have several interactive elements, such as remotes, that take user input. I’ve been trying to accomplish this by parenting a 3D UMG widget with transparent buttons to my model. While testing with non-transparent buttons I’m getting some odd results. My UMG widget is very simple: just a canvas panel with three buttons whose “hover” tint is set to yellow and which show a tooltip text. The buttons only seem to work in certain spots within the canvas panel, though. In the image below, the buttons circled in blue work just fine, but the middle-right section of the canvas panel seems to be a dead zone. The UMG button that spans the whole center only works on the left side: as soon as my mouse cursor reaches the left side, the highlight works. Oddly enough, if I move that whole button up in the canvas, the entire button works. I originally thought it could be the trace distance, but setting that to a very high value still yields the same results.

I know I could achieve what I want with trigger boxes and a custom hit test in Blueprint, but I wanted the ease of setting up the buttons in UMG, and the automatic hover highlighting was a nice feature. Any ideas what could be causing the problem?

By the way, this isn’t an isolated case; I’ve had similar problems with 3D UMG widgets having dead zones on other projects. Thanks for looking.
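For reference, the setup is roughly the C++ equivalent of this sketch (mine is actually done in Blueprint, the names here are placeholders, and this is written against current engine headers rather than 4.8, so treat it as an illustration rather than my exact setup):

```cpp
// RemoteActor.h -- rough sketch of a model with a 3D (world-space) UMG widget
// parented to it. Names are placeholders; the real setup is done in Blueprint.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/StaticMeshComponent.h"
#include "Components/WidgetComponent.h"
#include "RemoteActor.generated.h"

UCLASS()
class ARemoteActor : public AActor
{
	GENERATED_BODY()

public:
	ARemoteActor()
	{
		// The remote's mesh is the root; the widget component is attached to it.
		RemoteMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("RemoteMesh"));
		RootComponent = RemoteMesh;

		// World-space UMG widget that carries the (eventually transparent) buttons.
		// The widget class itself is assigned in the Blueprint defaults.
		ButtonsWidget = CreateDefaultSubobject<UWidgetComponent>(TEXT("ButtonsWidget"));
		ButtonsWidget->SetupAttachment(RemoteMesh);
		ButtonsWidget->SetDrawSize(FVector2D(512.f, 256.f));
	}

	UPROPERTY(VisibleAnywhere)
	UStaticMeshComponent* RemoteMesh;

	UPROPERTY(VisibleAnywhere)
	UWidgetComponent* ButtonsWidget;
};
```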

So I ended up just using trigger spheres for the remote asset, and it works just fine. I would still love to know, though, if anyone else has run into this issue with UMG widgets. I have several more assets to set up, and some of them will have to be UMG widgets, so I anticipate dealing with this problem more. In fact, I’m already seeing the issue on a phone asset I’ve set up. Although I could set this one up with trigger boxes as well, changing textures when they are clicked, it would be fantastic to use UMG’s ability to animate transitions for an interactive phone screen!

I set up a quick test and am having the same issue I described above. The button turns light green when the highlight works. The right side of the button does not respond to hovering or clicking at all. The left side works, but only sporadically: sometimes the hover works and sometimes it doesn’t. This isn’t a complex widget at all, just a canvas panel and a single button with text over it. The text is set to “Hit Test Invisible,” so it shouldn’t even matter, at least from my understanding; regardless, I get the same results even if I delete the text. I’m pretty baffled by this and would love an official opinion. Is this a bug? I’m testing in a new default scene right now and will report back with my results.

What version are you using? Looks like 4.8.

Can you reproduce this in a sample project and provide it? Note that all collision detection with widget components in the world goes through raycasts, the same as any physics colliders in the scene, so if you’ve got anything blocking it, it’s not going to work.
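If you want to sanity-check what is in front of the widget, a rough debug trace like the one below will log the first blocking hit between the camera and the widget. The trace channel and helper function here are just an example, not what the widget component uses internally:

```cpp
#include "CoreMinimal.h"
#include "Engine/World.h"
#include "CollisionQueryParams.h"

// Rough debug helper: trace from the camera toward the widget and log the
// first thing hit. If that isn't the widget component, something is in the way.
static void DebugTraceToWidget(UWorld* World, const FVector& CameraLocation,
                               const FVector& WidgetLocation)
{
	FHitResult Hit;
	FCollisionQueryParams Params(FName(TEXT("WidgetDebugTrace")), /*bTraceComplex=*/true);

	if (World->LineTraceSingleByChannel(Hit, CameraLocation, WidgetLocation,
	                                    ECC_Visibility, Params))
	{
		UE_LOG(LogTemp, Log, TEXT("First hit: %s"), *GetNameSafe(Hit.GetComponent()));
	}
	else
	{
		UE_LOG(LogTemp, Log, TEXT("Nothing hit between the camera and the widget."));
	}
}
```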

After some extensive testing I think I’ve made a little headway. This is 4.8, by the way. I have this hand with the phone parented to the player character, and it animates up into view when you access it. The problem is related to how close the UMG widget is to the camera: if I move the phone further away from the player, the button highlighting works as expected. I thought that maybe the hand being inside the capsule component was causing the problem, but I set up a simple test scene and can’t reproduce the behavior even when the widget is fully inside the capsule component and very close to the camera. The hand and phone have no collision on them at all. Ironically, when I enabled collision on them using “Use Complex Collision As Simple,” the button highlighting worked better, although it still had small dead spots.

I’m going to test a bit more to see what I might be doing wrong here, and I’ll try to reproduce it in a sample project to send. Knowing that it’s just using raycasts is helpful. I’ll post my updates.

So the plot thickens a little. I found that the widget doesn’t even have to be a 3D widget, so there goes my camera-proximity theory. Creating a regular widget and drawing it in the viewport has the same problem if the widget is center screen; move the widget anywhere else and it works. Unfortunately, I still can’t reproduce the problem in a clean scene; when I set it up there, it works just fine. Something in my scene has to be blocking the raycasts. I’ll keep digging.

If it’s not a 3D widget, then it’s not raycasts. Nothing in the scene could be blocking it once you make it a regular 2D widget; you must have something in your widget stopping it. You probably have a widget with Visibility set to Visible (which means hit-testable) above that side of the button. With the 2D widget on screen, open the widget reflector (shortcut: Ctrl+Shift+W), pick the area that isn’t clickable, and see which widget you’re actually hitting.

Alright, I finally found the issue; it was a stupid mistake of mine. For anyone who gets into this mess in the future: the problem was that I have two widgets in my viewport. One of them has a bunch of buttons all over the screen that are disabled by default until they get enabled in Blueprint. Unfortunately, although these buttons are disabled and not drawn, their Visibility attribute was set to “Visible”. The cursor was hitting them even though they were invisible to me and disabled. Setting those buttons’ visibility to “Collapsed” or “Hit Test Invisible” makes the problem go away.
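In code terms the fix is just something like this (I actually changed it in the UMG Designer’s details panel; the button variable here is only a placeholder):

```cpp
#include "Components/Button.h"
#include "Components/SlateWrapperTypes.h"

// Placeholder for each of the disabled overlay buttons that was silently
// swallowing cursor hits.
static void StopButtonEatingHits(UButton* OverlayButton)
{
	if (!OverlayButton)
	{
		return;
	}

	// "Visible" widgets are hit-testable even while disabled, which is why the
	// cursor was landing on these instead of what was underneath them.
	// Collapsed removes the button from layout and hit testing entirely;
	// HitTestInvisible keeps drawing it but lets hits pass through.
	OverlayButton->SetVisibility(ESlateVisibility::Collapsed);
	// or: OverlayButton->SetVisibility(ESlateVisibility::HitTestInvisible);
}
```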


ha, you called it! I didn’t see your response till I had found the issue, but thanks for letting me know about the widget reflector. I did not know that existed :slight_smile:

How did you do the cursor setup? I mean, you need to use a Widget Interaction Component, right? In that case, how did you draw the cursor?