Determining the on-screen touch coordinates when tapping a button

Step 1
Get the screen size.

Step 2
Calculate the area in which the button (the picture) can be shown.

I read the screen size from a variable and convert it to float. I divide X by 2 because the picture will be shown on the right half of the screen. On the right I also keep a margin of 20 pixels from max X, plus the width of the picture that will be shown.
For Y I do the following: from the top I inset by the margin, and from the bottom by the margin plus the picture height.

I then generate a random 2D vector inside that area and use it as the display point of the picture.
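The area and random-point math described above can be sketched like this (a hypothetical Python reconstruction of the Blueprint logic; the constant names and the function `random_display_point` are my own, not from the project):

```python
import random

# Assumed values from the post: 1280x720 screen, 20 px margin, 100x100 picture.
SCREEN_W, SCREEN_H = 1280.0, 720.0
MARGIN = 20.0
IMG_W, IMG_H = 100.0, 100.0

def random_display_point():
    # X: start at the middle of the screen (right half only),
    # and stop short of the right edge by the margin plus the picture width.
    x_min = SCREEN_W / 2.0
    x_max = SCREEN_W - MARGIN - IMG_W
    # Y: inset by the margin from the top,
    # and by the margin plus the picture height from the bottom.
    y_min = MARGIN
    y_max = SCREEN_H - MARGIN - IMG_H
    return (random.uniform(x_min, x_max), random.uniform(y_min, y_max))

print(random_display_point())
```

With these numbers the point always lands in X = 640..1160, Y = 20..600.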

Step 3
Step 3
I print debug information in the form: the screen touch coordinates (Location from InputTouch) and the coordinates of the point where the picture is displayed.

If everything is done correctly, then when I tap the picture, the touch coordinates should be only slightly larger than the picture's display coordinates (at most by the picture's size).
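That expectation amounts to a simple rectangle hit test. A minimal sketch, assuming the display point is the picture's top-left corner (the function name `touch_hits_picture` is mine, for illustration):

```python
def touch_hits_picture(touch, point, img_w=100.0, img_h=100.0):
    """True if the touch lies inside the picture's rectangle, i.e. its
    coordinates are larger than the display point by at most the picture size."""
    tx, ty = touch
    px, py = point
    return px <= tx <= px + img_w and py <= ty <= py + img_h

# Tapping the middle of a 100x100 picture drawn at (1006.342, 553.756):
print(touch_hits_picture((1056.342, 603.756), (1006.342, 553.756)))  # True
```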

And here’s what I end up with:


The first line, labeled TO:

x = 709.000, y = 400, z = 1.0 — the touch coordinates obtained from InputTouch (Location). I am touching exactly the middle of the picture.

X = 1280, Y = 720 — the screen size from the variable, on which the calculation of the image display area and of the display point is based. The picture is 100x100 pixels.

X = 1006.342, Y = 553.756 — the coordinates from the variable used to position the picture (the vector2D variable PointPricel).
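As a quick sanity check (Python, using the values quoted above), the point PointPricel does fall inside the area computed by the rules from Step 2, so the placement math itself looks consistent:

```python
# Stated layout rules: 1280x720 screen, right half only, 20 px margin, 100 px picture.
SCREEN_W, SCREEN_H = 1280.0, 720.0
MARGIN, IMG = 20.0, 100.0

x_min, x_max = SCREEN_W / 2.0, SCREEN_W - MARGIN - IMG   # 640.0 .. 1160.0
y_min, y_max = MARGIN, SCREEN_H - MARGIN - IMG           # 20.0 .. 600.0

print(x_min <= 1006.342 <= x_max)  # True: X is inside the display range
print(y_min <= 553.756 <= y_max)   # True: Y is inside the display range
```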

The second line, SetShot: X = 1006.342, Y = 553.756 — the same coordinates from that variable, printed inside the function that displays the picture (as a cross-check).

So something is wrong, but I don't understand what exactly. The logic and its execution look correct to me. Why the touch coordinates and the display coordinates don't match, and differ by this much, is a complete mystery to me.