Radar: convert target actor locations relative to the player character's location & orientation


I am trying to create a radar screen (using UMG). I am accumulating all target objects within the radar's scan range in an array. The center of the radar is the first-person character (FPC). The circular radar is shown in the bottom left of the screenshot.
Now, I wish to convert all the targets' world locations with respect to the FPC's location as well as orientation, i.e. all target locations should be converted as if the FPC's position were (0,0,0), and the FPC's rotation should also be taken into account. For example, if a target is exactly in front of the FPC and the FPC then rotates to the right, that same target should now appear on the FPC's left side on the radar screen.

I am a beginner in Unreal, and came across a function named 'Inverse Transform Location'. I tried using it, but the results don't behave as desired, and I am not sure how it is meant to be used to achieve the behaviour above.

Can anyone please help me?

P.S.: The red lines in the pic denote a sphere that comes from the sphere trace I am performing in order to get the world locations of any targets that lie inside it.


Inverse Transform Location is the correct node for this; can you screenshot your Blueprint setup? You've probably got something wired up wrong.

Here is the Blueprint setup. The function "Radar Convert Locations" takes as inputs the FPS (Radar Actor) and an array containing the world locations of scanned targets (the "targetLocation array").
The function converts the target locations (in a for loop) relative to the Radar Actor's location and orientation. Each converted location is then written back into the same "targetLocation array" at the corresponding index.
Currently, the converted coordinates are printed to the screen for debugging purposes.
I also have doubts as to whether "Inverse Transform Location" accounts for changes in the Radar Actor's rotation automatically, or whether we have to handle that separately. If so, how do we go about it (do we use "Inverse Transform Direction"? I have no idea what that does; the documentation is not detailed). There isn't much info in Unreal's documentation regarding this function.

More specifically, the converted Y coordinate behaves strangely: when the FPS moves to the left of the target, the target's converted Y coordinate is a positive number (assuming the FPS's position is (0,0,0) and it is facing straight ahead).

However, when the FPS moves to the right of the target object, the target's new Y coordinate is still positive (I believe it should become negative).