I did a beginner tutorial about creating doors in Blueprints, and decided to add some features of my own on top of the result.
In this case I added a separate position from which the key travels to the door.
When the player interacts with the door, the door blueprint creates a new component (static mesh), promotes it to a variable (SM_Key) and attaches it to a Scene component (KeySocket).
Then the door moves the key above the player's position (with SetRelativeLocation), from where it can eventually slowly move into the KeySocket position.
Getting the key above the player's position is where the problems arose. It would work fine, but only if the door was placed with certain rotations in the level.
Otherwise, the vector between the KeySocket and the player would still be perfect, but for some reason rotated to odd angles (around the Z axis).
After hours of trying all sorts of things and hundreds of “print” nodes I finally got something that works.
Unfortunately I don’t really understand how it works.
In the attached image I have my blueprint code that handles the rotation.
Can someone explain to a beginner why and how this actually works?
1- In the viewport of the door blueprint, the static mesh SM_Door is rotated 90 degrees so the door faces the X axis.
2- The door will create a vector from the KeySocket to above the player. This is the path the key will travel along.
3- The first part calculates the vector. The second part is the bit I don’t understand.
4- Setting the position of the static mesh SM_Key works fine now.
Basically, the whole point of using Blueprints is that everything can happen relative to the BP itself. You don’t need to refer to world space at all.
Your problems are coming from mixing world coordinates with local / BP coordinates.
When you place the BP in the level, you shouldn’t have to worry about what angle you place things at, it should just work, right? That’s why we do all of the calculations in terms of relative distances and rotations inside the BP.
For instance, if you want to rotate a door ( normal door ), you don’t care what its world rotation is, just its relative rotation. Relative rotation ( inside the BP ) will handle it all.
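The idea above can be sketched as plain math (Blueprint is visual, so this is just a minimal Python illustration of how world yaw and relative yaw compose; the function name `compose_yaw` is made up for the example, not an engine call):

```python
def compose_yaw(parent_yaw, relative_yaw):
    """In a parent/child hierarchy, world yaw = parent yaw + child's relative yaw."""
    return (parent_yaw + relative_yaw) % 360.0

# The door blueprint placed at three different world rotations in the level:
for placement_yaw in (0.0, 90.0, 215.0):
    closed = compose_yaw(placement_yaw, 0.0)
    opened = compose_yaw(placement_yaw, 90.0)   # e.g. SetRelativeRotation(yaw=90)
    # The door always swings 90 degrees relative to its frame,
    # no matter how the actor itself is rotated in the level.
    assert (opened - closed) % 360.0 == 90.0
```

This is why animating with relative rotation “just works” at any placement angle: the actor’s world rotation is applied on top, for free, by the hierarchy.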
Another major problem early ‘blueprinters’ run into ( I certainly did ) is not ‘zeroing’ the components you intend to animate inside the BP.
Let’s say you have a door, a door frame, a handle etc inside the BP, and you spend some time wiggling these around to get it looking right. By the time you’ve finished tweaking all the components, they will almost certainly have non-zero transforms. In other words, when you look at the details panel for a given component, its transform will not be neutral.
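A quick sketch of why a non-zero baked transform bites you (illustrative numbers only): setting a relative rotation sets an absolute value in parent space, it doesn’t add a swing on top of wherever you left the mesh.

```python
# Suppose the door mesh was accidentally left at relative yaw 30
# after tweaking it in the viewport.
baked_yaw = 30.0

# Your animation logic calls the equivalent of SetRelativeRotation(yaw=90),
# expecting a 90-degree swing from "closed"...
target_relative_yaw = 90.0

# ...but the mesh only moves the difference from where it was baked:
actual_swing = target_relative_yaw - baked_yaw  # 60 degrees, not 90
```

Zeroing the component first (and moving any offset into a parent scene component) keeps “relative yaw 0” meaning “closed”.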
I have been pondering your remarks about Global Space vs Relative Space and tested your theory with lots of Debug Lines and Print Nodes.
For this testing I have undone the 90 degree turn of the door mesh in the viewport, so it now has a zero transform.
And now I do grasp what the unknown code was doing.
In the first part of Image.3 the vector KeyPath is being calculated perfectly.
When the door draws the vector KeyPath in relative space it still draws the vector, but with an offset (it only appears as an offset when viewed from global space).
So the second part of Image.3 has to rotate the vector to compensate for the door’s global rotation, which it does with RotateVectorAroundAxis.
And the reason WorldRotation.Z has to be flipped is that otherwise it would only add to the rotation offset, not nullify it.
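That last step can be sketched as plain math. Below is a minimal Python version of rotating a vector around the Z axis (the same operation RotateVectorAroundAxis performs for a Z axis input); it shows that rotating by the actor’s yaw takes a local vector into world space, and rotating by the *negated* yaw undoes that, which is exactly why the sign flip cancels the offset instead of doubling it:

```python
import math

def rotate_around_z(v, angle_deg):
    """Rotate a 3D vector (x, y, z) around the Z axis by angle_deg degrees."""
    a = math.radians(angle_deg)
    x, y, z = v
    return (x * math.cos(a) - y * math.sin(a),
            x * math.sin(a) + y * math.cos(a),
            z)

# Say the door actor is placed in the level with a world yaw of 90 degrees,
# and the key path points along +X in the actor's local space.
actor_yaw = 90.0
local_path = (100.0, 0.0, 0.0)

# Seen in world space, that same path points along +Y:
world_path = rotate_around_z(local_path, actor_yaw)

# Rotating by the NEGATED yaw converts it back to local space,
# cancelling the actor's placement rotation:
recovered = rotate_around_z(world_path, -actor_yaw)
```

Rotating by `+actor_yaw` again would have stacked the placement rotation on twice, which is the “weird angles around Z” the original post describes.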