How to use two Touch Interfaces at once?

Hey guys, so I’m building a mobile FPS game and I want players to use half the screen for aiming, while the other half has buttons such as ‘Fire’, ‘Reload’, etc. The problem I’m facing is that I need the ‘Prevent Recenter’ option in the Touch Interface enabled for the buttons but disabled for the aiming joystick, and the only way I have found to do that is to make two different Touch Interfaces. So I really need a way to use two different Touch Interfaces at the same time — one for the buttons and one for the aiming — but I have no idea if that is even possible, and I haven’t found a workaround either.
I need this because when Prevent Recenter is disabled, the buttons become hard to use: they keep changing position when pressed, which may confuse players. And if Prevent Recenter is enabled, the aiming feels weird.

Please help me with this if you have any suggestions. Thank you.

Short version:
Do your own touch input logic. Use UMG widgets only to display things, not to receive touch input.

Longer version:
Track all touch points (I think it’s up to 10), calculate their coordinates, and compare them to your widgets’ coordinates (the hard part, as there is no easy way to get a widget’s coordinates in screen space).

My whining about unreal interface:
UMG was not designed for games or touch interfaces; it is almost unusable for mobile games that need a fire button and drag/drop stuff. So suffer in silence, or move to an engine like Godot.

Thanks for the answer, but I don’t really know how to track the touch points. Can you please explain how, or link a tutorial that covers it? That would be a great help.