My end goal is to build a simple but custom touch UI for a mobile game. To do this I need the screen-space location where the user is touching.
I have this working in the editor using the mouse position, either scaling it by the DPI myself or using the “Get Mouse Position Scaled by DPI” node. Since I want this to work with touch, I want to use the “Get Input Touch State” node to return the X/Y screen-space location where the user is touching.
I am getting drastically different results from the touch location versus the mouse location, despite having “Use Mouse for Touch” enabled. In theory I'd expect these to be the same values. Everything works perfectly when I use the mouse values, but no matter what I do, the 2D vector from the “Get Input Touch State” node is consistently far off.
An example of the two different values is below:
LogBlueprintUserMessages: [Char_C_0] Touch:X=315.000 Y=80.533
LogBlueprintUserMessages: [Char_C_0] Mouse:X=309.000 Y=280.000
LogBlueprintUserMessages: [Char_C_0] Touch:X=315.000 Y=80.533
LogBlueprintUserMessages: [Char_C_0] Mouse:X=274.000 Y=312.000
LogBlueprintUserMessages: [Char_C_0] Touch:X=315.000 Y=80.533
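For reference, the Blueprint logic that produced the log output above corresponds roughly to this C++ sketch. This is illustrative only: it assumes a hypothetical `APlayerController` subclass named `AMyPlayerController`, and it only compiles inside an Unreal Engine project. `GetInputTouchState` is the native API the “Get Input Touch State” node wraps.

```cpp
// Illustrative sketch only; requires the Unreal Engine build environment.
// AMyPlayerController is a hypothetical APlayerController subclass.
#include "GameFramework/PlayerController.h"

void AMyPlayerController::LogTouchVsMouse()
{
    // Touch position from the same API the "Get Input Touch State"
    // Blueprint node wraps.
    float TouchX = 0.f, TouchY = 0.f;
    bool bPressed = false;
    GetInputTouchState(ETouchIndex::Touch1, TouchX, TouchY, bPressed);

    // Mouse position for comparison.
    float MouseX = 0.f, MouseY = 0.f;
    const bool bHasMouse = GetMousePosition(MouseX, MouseY);

    if (bPressed)
    {
        UE_LOG(LogTemp, Log, TEXT("Touch: X=%.3f Y=%.3f"), TouchX, TouchY);
    }
    if (bHasMouse)
    {
        UE_LOG(LogTemp, Log, TEXT("Mouse: X=%.3f Y=%.3f"), MouseX, MouseY);
    }
}
```

Calling this each tick while dragging with “Use Mouse for Touch” enabled reproduces the mismatch shown in the logs above.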
If anyone has a workaround for this, it would be appreciated. Thank you!