How does this rotational fix actually work?

I did a beginner tutorial about creating doors in Blueprints, and decided to add some features of my own on top of the result.
In this case I added a separate position from which the key travels to the door.

When the player interacts with the door, the door blueprint creates a new Static Mesh component, promotes it to a variable (SM_Key) and attaches it to a Scene component (KeySocket).
Then the door moves the key above the player's position (with SetRelativeLocation), from where it can slowly move into the KeySocket position.
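
For reference, here is a rough C++ sketch of that flow (the class name ADoorActor, the KeyMesh property and the 150-unit offset are placeholders of mine, not from the tutorial):

```cpp
#include "Components/StaticMeshComponent.h"
#include "Kismet/GameplayStatics.h"

// Hypothetical door actor with SM_Key (UStaticMeshComponent*), KeySocket
// (USceneComponent*) and KeyMesh (UStaticMesh*) properties.
void ADoorActor::OnInteract()
{
    // Create the key mesh at runtime and attach it to the KeySocket scene component.
    SM_Key = NewObject<UStaticMeshComponent>(this, TEXT("SM_Key"));
    SM_Key->SetStaticMesh(KeyMesh);
    SM_Key->RegisterComponent();
    SM_Key->AttachToComponent(KeySocket, FAttachmentTransformRules::KeepRelativeTransform);

    // A point above the player, in WORLD space.
    const FVector AbovePlayer =
        UGameplayStatics::GetPlayerPawn(this, 0)->GetActorLocation() + FVector(0.f, 0.f, 150.f);

    // Pitfall: SetRelativeLocation expects a location in the KeySocket's LOCAL frame,
    // but this difference of two world positions is a world-space offset.
    // It only lines up when the door has no rotation in the level.
    SM_Key->SetRelativeLocation(AbovePlayer - KeySocket->GetComponentLocation());
}
```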

Getting the key above the player's position is where the problems arose. It would work fine, but only if the door was placed at certain rotations in the level.
Otherwise, the vector between the KeySocket and the player was still correct, but for some reason it ended up rotated to a weird angle (around the Z axis).

After hours of trying all sorts of things and hundreds of Print nodes I finally got something that works.
Unfortunately, I don't really understand how it works.

In the attached image I have my blueprint code that handles the rotation.
Can someone explain to a beginner why and how this actually works?

Image info:
1- In the viewport of the door blueprint, the static mesh SM_Door is rotated 90 degrees so the door faces the X axis.
2- The door will create a vector from the KeySocket to a point above the player. This is the path the key will travel along.
3- The first part calculates the vector. In the second part I did a thing.
4- Setting the position of the static mesh SM_Key works fine now.

P.S. The initial tutorial was from Ryan Laley.

I understand where you’re coming from here.

  1. Basically, the whole point of using blueprints is that everything can happen relatively, inside the BP. You don't need to refer to world space at all.

Your problems are coming from mixing world coordinates with local / BP coordinates.

When you place the BP in the level, you shouldn’t have to worry about what angle you place things at, it should just work, right? That’s why we do all of the calculations in terms of relative distances and rotations inside the BP.

For instance, if you want to rotate a door ( a normal door ), you don't care what its world rotation is, just its relative rotation. Relative rotation ( inside the BP ) will handle it all.
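
A tiny sketch of that idea in code ( names like ADoorActor and DoorMesh are assumptions, not from the thread ): drive the door purely by its relative rotation and the placement in the level never enters into it.

```cpp
// Open the door by setting the RELATIVE rotation of the mesh. Because the value
// is relative to the BP, it works no matter how the actor is rotated in the level.
void ADoorActor::UpdateDoor(float OpenAlpha)   // 0 = closed, 1 = fully open
{
    const float Yaw = FMath::Lerp(0.f, 90.f, OpenAlpha);
    DoorMesh->SetRelativeRotation(FRotator(0.f, Yaw, 0.f));   // (Pitch, Yaw, Roll)
}
```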

  2. Another major problem early ‘blueprinters’ run into ( I certainly did ) is not ‘zeroing’ the components you intend to animate inside the BP.

Let’s say you have a door, a door frame, a handle, etc. inside the BP. And you spend some time wiggling these around to get it looking right. By the time you’ve finished tweaking all the components, they will almost certainly have non-zero transforms. In other words, when you look at the details panel for a given component, its transform will not be neutral.

Here’s a non-zero component:

See the top bar of this window:

Its coordinates are all over the shop. A zero component looks like this:

I know it’s mad to have a sphere in the middle of the window, but I’m just making a point. See how the location and rotation info are at zero.

Basically, if you want to animate components and the details are non-zero, you have a world of pain heading your way, because you have to keep compensating for all that transformation.
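
To make that concrete with a hypothetical example ( continuing the sketch above, not something from the post ): if the mesh was left at 90 degrees of yaw in the details panel, that 90 has to be dragged along through every value you ever set on it.

```cpp
// With a non-zero starting transform, the leftover 90° yaw has to be added back in
// every time, otherwise the door snaps to the wrong orientation when animated.
const float BakedInYaw = 90.f;   // leftover from tweaking the component in the viewport
DoorMesh->SetRelativeRotation(FRotator(0.f, BakedInYaw + FMath::Lerp(0.f, 90.f, OpenAlpha), 0.f));
```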

How to get round this? Scenes.

I changed the window to a door and made sure the door is non-zero:

See, it’s already rotated by 90 in Z. No good.

Steps to fix this are:

  1. Select the door

  2. Add component ‘Scene’ ( the scene has zero transform with respect to the door, that’s important to notice )

  3. Drag the scene onto the door in the components window ( this will detach it from the door ):


and then drag the door onto the scene:

What was the point of that? Well, now look at the door’s transform, it’s zero.
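
In C++ terms the end result looks roughly like this ( a sketch, with DoorPivot and DoorMesh as assumed names ): the plain Scene component carries any placement offset, and the mesh you animate sits under it with a clean zero transform.

```cpp
ADoorActor::ADoorActor()
{
    // The Scene component becomes the parent / pivot...
    DoorPivot = CreateDefaultSubobject<USceneComponent>(TEXT("DoorPivot"));
    RootComponent = DoorPivot;

    // ...and the mesh we animate sits under it at relative transform (0, 0, 0).
    // Any "make it sit right" offset or rotation goes on DoorPivot instead.
    DoorMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("SM_Door"));
    DoorMesh->SetupAttachment(DoorPivot);
}
```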

Happy animating.

Hope this was useful :slight_smile:

Thank you for taking the time to respond!

I have been pondering your remarks about Global Space vs Relative Space and tested your theory with lots of Debug Lines and Print Nodes.
For this testing I undid the 90-degree turn of the door mesh in the viewport, so it now has a zero transform.
And now I do grasp what the unknown code was doing.

In the first part of Image.3 the vector KeyPath is being calculated perfectly.
When the door draws the vector KeyPath in Relative Space, it still draws the vector, but with an offset (though it is only an offset from the perspective of Global Space).

So the second part of Image.3 has to rotate the vector to compensate for the global rotation, which it does with RotateVectorAroundAxis.
And the reason the WorldRotation.Z has to be flipped (negated) is that otherwise it would only add to the rotation offset rather than cancel it.
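
If it helps anyone else, here is roughly the same compensation in C++ (KeyPath and RotateVectorAroundAxis are from my graph; the other names are placeholders). RotateVectorAroundAxis corresponds to FVector::RotateAngleAxis, and rotating by the negated yaw is what undoes the door's placement rotation:

```cpp
// KeyPath computed in world space, as in the first part of the graph.
// AbovePlayer is the world-space point above the player from the earlier sketch.
const FVector KeyPathWorld = AbovePlayer - KeySocket->GetComponentLocation();

// Second part: undo the door's world yaw. The NEGATED yaw cancels the placement
// rotation; using +Yaw would add to the offset instead of removing it.
const float DoorYaw = GetActorRotation().Yaw;
const FVector KeyPathLocal = KeyPathWorld.RotateAngleAxis(-DoorYaw, FVector::UpVector);

// Equivalent (and it also handles pitch/roll), using the actor's full transform:
const FVector KeyPathLocalAlt = GetActorTransform().InverseTransformVector(KeyPathWorld);
```

As far as I can tell, that last line is what the Inverse Transform Direction node does in Blueprints, so it could replace the whole rotate-and-flip section in one step.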

Thus my question is answered.

Great that you now understand what’s going on.

You can probably also see that sticking to relative space helps :slight_smile: