This is my touch interface layout:
I was tired of constantly tweaking the conditions that calculate which action should be executed when the user touches given x,y coordinates.
So I had the idea to paint zones, where each zone maps to the same touch action. Sadly, there is currently no easy way to do this in Unreal.
My hacky workaround is a blueprint with 3D objects that represent the layout (for easy reading they are colored; each color sits at a different Z distance). I trace the Z distance at the touched X,Y coordinates, and the result tells me which action "zone" was touched.
The whole blueprint feels like quite a hacky way of doing things, but it works well.
So my idea is to add a property on UMG widgets that holds a zone ID for touch events, or maybe a whole new class of invisible widgets that hold such an ID, so we could paint our touch areas directly in UMG.
PS: It would also be great if all trace functions could return some property value from the hit actor, e.g. an integer or string.