Touch Interface Blocks Widget Component Interaction

Hey,

I used a Widget Component on an actor with the mobile touch interface (4.10.1), and it seems the touch interface doesn't let the widget interaction work. When I do Eject, it works fine.

Also, the sticks don't overlap with the widget, so it may be a bug.
If it is, is there any other workaround?

Thanks.

Hello Cgraider,

I have a few questions for you that will help narrow down what issue it is that you are experiencing.

Quick questions:

  1. Can you reproduce this issue in a clean project?
  2. If so, could you provide a detailed set of steps to reproduce this issue on our end?
  3. Could you provide screenshots of any blueprints that may be involved with this issue?

Okay, first:

  1. I created a new project with the Third Person template with the tablet and 2D options
  2. Made a new Widget Blueprint
  3. Put a button into it, scaled it, set anchors, compiled
  4. Made a new Blueprint class (Actor)
  5. Added a Widget Component
  6. In the widget class, selected my WidgetBP and set the widget attributes: double sided, max interaction distance: 100000
  7. Set this up inside the Third Person graph

And it's not working.


I also tried setting the pivots to 0.
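
For reference, here is a rough C++ equivalent of the actor setup in steps 4–6, in case it helps anyone reproduce this outside of Blueprints. Class and asset names are placeholders, and the widget class and max interaction distance are assumed to be set in the details panel, as in the repro above:

```cpp
// WidgetActor.h -- sketch only; all names are placeholders.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/WidgetComponent.h"
#include "WidgetActor.generated.h"

UCLASS()
class AWidgetActor : public AActor
{
    GENERATED_BODY()

public:
    AWidgetActor()
    {
        WidgetComp = CreateDefaultSubobject<UWidgetComponent>(TEXT("WidgetComp"));
        RootComponent = WidgetComp;

        // Mirrors steps 5-6: a world-space, double-sided widget component.
        // The WidgetBP class and MaxInteractionDistance are assigned in the
        // actor's details panel here, as in the original repro.
        WidgetComp->SetWidgetSpace(EWidgetSpace::World);
        WidgetComp->SetTwoSided(true);
        WidgetComp->SetDrawSize(FVector2D(500.f, 300.f));
    }

    UPROPERTY(VisibleAnywhere)
    UWidgetComponent* WidgetComp;
};
```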

After doing a bit of digging I was able to find that this is a known issue (UE-23843) and that it has been submitted to the developers for further consideration. I will provide updates with any pertinent information as it becomes available. Thank you for your time and information.

Make it a great day

Has this been resolved?

I have provided a link to the public tracker. Please feel free to use the link provided for future updates.

Link: Unreal Engine Issues and Bug Tracker (UE-23843)

Make it a great day

I assume you work in BP.

A workaround for that problem is to create a main HUD which holds all of the windows (or HUDs) as children of type Canvas. But you should never work with the initial Canvas or the main HUD widget component itself.

You have to add the widget to the viewport and not the player screen.

Build a tree diagram like this:

MainHUD (Widget component)

  • Root (initial Canvas, NEVER change it)
  • Player HUD (Show/Hide)
  • Inventory HUD (Show/Hide)
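
For illustration, a minimal C++ sketch of that pattern, assuming the master HUD is a single UserWidget added to the viewport once and the child HUDs are named widgets inside its canvas (the names here are placeholders):

```cpp
#include "Blueprint/UserWidget.h"
#include "Components/Widget.h"
#include "Components/SlateWrapperTypes.h"

// Show or hide one named child panel of the master HUD without ever removing
// the HUD (or its root canvas) from the viewport.
void SetPanelVisible(UUserWidget* MainHUD, FName PanelName, bool bShow)
{
    if (!MainHUD)
    {
        return;
    }

    if (UWidget* Panel = MainHUD->GetWidgetFromName(PanelName))
    {
        // Toggle visibility instead of removing the widget, so the widget
        // tree (and with it the touch interface) stays intact.
        Panel->SetVisibility(bShow ? ESlateVisibility::Visible
                                   : ESlateVisibility::Collapsed);
    }
}
```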

Hope someone has a better idea…

I don’t get how a widget component can be attached to a HUD class.

Sorry that I explained it so badly…

You just create one widget which contains all the UI information you want to display.
Anything you would otherwise display in a separate widget is stored in another child widget instead.
But never touch the “Root Canvas”. Hopefully my idea is clearer now…

I found a better solution for the problem:

Always use: Create Widget → Add to Viewport

When removing: Get All Widgets Of Class → Remove From Parent

The important thing is to add all widgets to the viewport and never use “Remove All Widgets”,
as that would also remove the touch interface.
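
For reference, a rough C++ mirror of those Blueprint calls; the function names and parameters here are just an illustration, not the poster's exact setup:

```cpp
#include "Blueprint/UserWidget.h"
#include "Blueprint/WidgetBlueprintLibrary.h"

// Create Widget -> Add to Viewport
void ShowMenu(APlayerController* PC, TSubclassOf<UUserWidget> WidgetClass)
{
    if (UUserWidget* Menu = CreateWidget<UUserWidget>(PC, WidgetClass))
    {
        Menu->AddToViewport();
    }
}

// Get All Widgets Of Class -> Remove From Parent.
// Deliberately NOT UWidgetLayoutLibrary::RemoveAllWidgets, which would also
// strip the touch interface, as noted above.
void HideMenu(UObject* WorldContext, TSubclassOf<UUserWidget> WidgetClass)
{
    TArray<UUserWidget*> Found;
    UWidgetBlueprintLibrary::GetAllWidgetsOfClass(WorldContext, Found, WidgetClass, false);

    for (UUserWidget* Widget : Found)
    {
        Widget->RemoveFromParent();
    }
}
```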


Thanks, I get it now.
However, the initial issue was about creating a widget component within a Blueprint actor, which would be placed in 3D space. The widget component references a normal widget Blueprint class.
This works well, and the widget displays in 3D space.
If the widget has a button, that button functions as it should on the 3D widget.
However, if the touch interface is showing, the button interactivity is lost.

Almost there. Basically we have to check whether the hit location is inside the button geometry.
The touch event works on the widget component as a whole; it just doesn't trigger the button in the widget referenced by the widget component. Hence we check the location of the hit and see if it is inside the button area.
The widget being referenced is a simple widget with just one button, Button_0.
PS: This is not working at the moment because I have no idea about the geometry structure. I just need to check whether a geometry can define an area and then check if the hit location is inside that area.
Maybe someone with more experience can help with this.
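
In case it helps, here is an untested C++ sketch of that idea. It assumes UWidgetComponent::GetLocalHitLocation maps a world-space hit onto the widget surface and that the button's cached geometry can be tested against that point; the coordinate-space conversion in particular is a guess and may need adjusting:

```cpp
#include "Components/WidgetComponent.h"
#include "Components/Button.h"
#include "Blueprint/UserWidget.h"

// Return true if a world-space hit on the widget component lands inside the
// geometry of the button named "Button_0" in the referenced widget.
bool IsHitOnButton(UWidgetComponent* WidgetComp, const FVector& WorldHitLocation)
{
    UUserWidget* Widget = WidgetComp ? WidgetComp->GetUserWidgetObject() : nullptr;
    if (!Widget)
    {
        return false;
    }

    UButton* Button = Cast<UButton>(Widget->GetWidgetFromName(TEXT("Button_0")));
    if (!Button)
    {
        return false;
    }

    // Project the world-space hit onto the widget's 2D surface.
    FVector2D LocalHit;
    WidgetComp->GetLocalHitLocation(WorldHitLocation, LocalHit);

    // Compare against the button's cached geometry. IsUnderLocation expects an
    // absolute coordinate, so translate the local hit through the root widget's
    // geometry first (this conversion is the part I am least sure about).
    const FVector2D AbsoluteHit = Widget->GetCachedGeometry().LocalToAbsolute(LocalHit);
    return Button->GetCachedGeometry().IsUnderLocation(AbsoluteHit);
}
```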

The touch interface/virtual joystick blocks touch events for widget components in the level, and the issue apparently won't be fixed. My only solution was to disable the touch interface for the duration of my event and re-enable it afterwards: use Get Player Controller → Activate Touch Interface with ‘None’ selected, and after the interaction do the same, only selecting the touch interface you want to bring back.
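
In C++ the same workaround would look roughly like this (the joystick asset reference is a placeholder for whichever touch interface the project normally uses):

```cpp
#include "GameFramework/PlayerController.h"
#include "GameFramework/TouchInterface.h"

// Clear the touch interface before interacting with the 3D widget.
void DisableTouchInterface(APlayerController* PC)
{
    if (PC)
    {
        PC->ActivateTouchInterface(nullptr); // equivalent to selecting "None"
    }
}

// Restore the touch interface once the interaction is finished.
void RestoreTouchInterface(APlayerController* PC, UTouchInterface* JoystickAsset)
{
    if (PC)
    {
        PC->ActivateTouchInterface(JoystickAsset); // bring the joysticks back
    }
}
```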



I have found a way to work around it: the OnInputTouchBegin event from the Widget Component does detect the touch, so I launch a custom event on the 3D widget that I have previously saved in a variable.
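
Roughly, in C++ terms, that would look like the sketch below. AWidgetActor, UMy3DWidget, HandleTouchPressed and the member variables are placeholders for whatever the Blueprint setup actually uses:

```cpp
// WidgetActor.cpp (sketch; the class declaration is omitted for brevity).
// HandleInputTouchBegin must be declared as a UFUNCTION() in the header for
// AddDynamic to bind it.
#include "Components/WidgetComponent.h"

void AWidgetActor::BeginPlay()
{
    Super::BeginPlay();

    // Keep a typed reference to the 3D widget so the custom event can be
    // called on it later.
    Saved3DWidget = Cast<UMy3DWidget>(WidgetComp->GetUserWidgetObject());

    // This fires even while the virtual joysticks are on screen; component
    // touch events typically require "Enable Touch Events" on the player
    // controller.
    WidgetComp->OnInputTouchBegin.AddDynamic(this, &AWidgetActor::HandleInputTouchBegin);
}

void AWidgetActor::HandleInputTouchBegin(ETouchIndex::Type FingerIndex,
                                         UPrimitiveComponent* TouchedComponent)
{
    if (Saved3DWidget)
    {
        // Stand-in for the Blueprint custom event mentioned above.
        Saved3DWidget->HandleTouchPressed();
    }
}
```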