I have two vectors that draw a line together. Vector A is an anchor, and Vector B is the free end of the line. If Vector A rotates, I need Vector B to rotate around Vector A, effectively making the line rotate by however many degrees I tell it to. The problem is, I'm having a really hard time getting the math to work out, and it's causing bizarre results. Does anyone know how to do this?
There is a vector node called “rotate vector around axis” that might help you.
I'll see if I can find a good reference, but if you search on YouTube there should be plenty of tutorials.
I’ve been trying that node, but it doesn’t seem to work. I assume I’d put Vector B into In Vect and Vector A into Axis, but that hasn’t worked
If you need the rotation to change in real time, have you tried connecting it to an Event Tick, or to a looping function driven by Set Timer by Function Name?
Can you share how you set it up?
It doesn’t need to rotate in real time. It needs to rotate once and instantaneously.
If you look at the picture below, you’ll see the same room twice, but one is rotated 90 degrees. You’ll also see a debug line that goes from an actor’s location to the player. That line is replicated in the other room, except it isn’t rotating like the room is. Does that make sense?
I think I got it. Do you want the actor to rotate with the room? Is this actor movable or static?
If static, you just need to parent it to the main room mesh and you are good to go. Whenever you rotate the room, the actor will follow. Just drag and drop the actor (or any other actor) to the target mesh in the outliner panel.
The problem is that it's not an actor, it's a vector. The line you see is just a debug line.
It is not clear to me. How are you creating that debug line?
Can you record a short video clip showing the logic?
A debug line as such is not an object that gets transformed - it is just a function you call, with a start and end point. Wouldn't you just need to get the transformed locations of the actor and the player and call DrawDebugLine with those values to get a transformed debug line?
The debug line is just a visual. Here's what the program does:
1. Uses vector subtraction to draw a line from an anchor point (in this case, the ramp) to the player. It stores the player's relative location to the anchor point in a vector variable.
2. Reproduces that line in the duplicate room using the relative location variable from step 1.
3. Teleports the player to the duplicate room at the new relative location vector.
The issue is in step 2: if the room rotates, the relative location vector doesn't rotate with it. That is why both rooms' lines point in the same direction even though the rooms themselves are rotated 90 degrees from each other.
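For what it's worth, the underlying issue can be sketched outside of Blueprint. This is a plain-Python illustration (not the poster's actual setup; `rotate_z` is a made-up helper) of why a stored relative-location vector ignores the room's rotation unless you rotate it yourself around the Z (yaw) axis:

```python
import math

def rotate_z(v, degrees):
    """Rotate a 3D vector around the Z axis (yaw), the same math as
    Rotate Vector Around Axis with the axis set to (0, 0, 1)."""
    rad = math.radians(degrees)
    c, s = math.cos(rad), math.sin(rad)
    x, y, z = v
    return (x * c - y * s, x * s + y * c, z)

# Player is 100 units from the anchor along +X.
player_relative = (100.0, 0.0, 0.0)

# If the duplicate room is rotated 90 degrees in yaw, the stored relative
# vector must be rotated by the same amount before it is reused.
rotated = rotate_z(player_relative, 90.0)
print(rotated)  # approximately (0.0, 100.0, 0.0)
```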
Assuming the room is going to rotate during play, it seems you are tracking the start and end locations of the trace but not saving those values anywhere. Assuming you do have the location saved in a variable: if you rotate the room, how is the player going to be teleported to that reference point?
In my understanding, you need three functions:
- One function to save the player's relative location and rotation, then execute the code that rotates the room
- An event dispatcher called on "rotation finished", triggered after the room rotation is completed
- A custom event bound to #2 that teleports the player to the target location and rotation relative to the anchor point.
I hope that makes sense.
You can use the Dot Product node to calculate the relative rotation between two actors.
That’s exactly what it’s doing right now, but the player location vector doesn’t rotate with the room
The location vector won't give you a relative rotation. If the world rotates, do you also want the player to keep its relative rotation?
Let's take an algorithm as an example:
1. Your game starts, and the world rotation is saved as Initial Rotation.
2. The player is moving around, and suddenly a function rotates the room.
2.1. Before executing the rotation, the player's input is disabled and they are attached to the world center, keeping their relative location and rotation to it (Attach Actor to Component node, with the location, rotation, and scale rules set to Keep World).
2.2. The room is rotated, let's say 45º around the Z axis, which also rotates the player, keeping their relative location and rotation.
2.3. The player is detached from the world center (Detach From Component, with the location, rotation, and scale rules set to Keep World), and the player's input is re-enabled.
3. The gameplay continues.
Is that what you want to achieve? If not, try writing down the steps needed for your mechanic in the form of an algorithm; that will help you visualize what needs to be done, and in the right order.
The room doesn't rotate during play. The room doesn't actually rotate at all. The player is teleporting into a duplicate room; the duplicate room was rotated by X degrees when I or whoever set up the level (this is for a plugin, so the angle could be anything). Here is a step-by-step breakdown of what happens:
1. The player activates the teleporter (this can be done in all sorts of ways, but we'll use a trigger box in this example).
2. Both rooms have a duplicate object in them that serves as an anchor point. Once the player steps into the trigger box, the teleporter blueprint/code subtracts the anchor point location from the player location. The resulting vector is a line pointing from the anchor to the player. This is stored in a variable called PlayerDistance.
3. The teleporter then recreates this line in the duplicate room by adding PlayerDistance to the duplicate anchor point's location.
4. The player is then teleported to the end of the duplicate line using SetActorLocation().
If the two rooms are oriented the same, this works, but if they are rotated differently, it doesn't. This is because the PlayerDistance vector isn't rotated around the duplicate anchor point, so the program draws the line as if the room weren't rotated. What I need is a way to transform the PlayerDistance vector according to the rotation angle. I've spent a lot of time trying trigonometry (not something I'm well versed in), but to no avail. Does this make more sense?
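Putting those steps together, here is a minimal plain-Python sketch (names like `teleport_target` are made up for illustration; in Blueprint this is vector subtraction, Rotate Vector Around Axis with a (0, 0, 1) axis, and vector addition):

```python
import math

def rotate_around_z(v, degrees):
    """Same math as Rotate Vector Around Axis with axis (0, 0, 1)."""
    rad = math.radians(degrees)
    c, s = math.cos(rad), math.sin(rad)
    return (v[0] * c - v[1] * s, v[0] * s + v[1] * c, v[2])

def teleport_target(player, anchor_a, anchor_b, delta_yaw_degrees):
    # Step 2: PlayerDistance, the vector from the source anchor to the player.
    player_distance = tuple(p - a for p, a in zip(player, anchor_a))
    # The missing step: rotate PlayerDistance by the yaw difference between rooms.
    rotated = rotate_around_z(player_distance, delta_yaw_degrees)
    # Steps 3-4: end of the recreated line, i.e. the SetActorLocation target.
    return tuple(b + r for b, r in zip(anchor_b, rotated))

# Player 100 units ahead and 50 to the side of the source anchor;
# the duplicate room is rotated 90 degrees.
print(teleport_target((100.0, 50.0, 0.0), (0.0, 0.0, 0.0),
                      (1000.0, 0.0, 0.0), 90.0))
# approximately (950.0, 100.0, 0.0)
```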
Thank you for clarifying with this algorithm. Here are a few notes:
Since each room has an anchor point, you can track its world rotation yaw. If its X axis (red arrow) is aligned with the world's X axis, then yaw = 0. If a second room is rotated X degrees, its anchor point should have yaw = X degrees. To illustrate this point, see both cubes in the level:
The left cube has world rotation yaw = 0; the right one has yaw = 50. This is the angle reference you want to use to rotate your trace vector and determine the correct player location, which I still believe can be done with Rotate Vector Around Axis (Z axis). If the starting point is known and you have the rotated vector, you can find the target end location.
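If the anchors' yaws are the angle reference, the amount to rotate by is just the signed difference between them. A small plain-Python sketch (`delta_yaw` is an assumed helper, not a Blueprint node):

```python
def delta_yaw(yaw_a, yaw_b):
    """Signed yaw difference from room A to room B, normalized to (-180, 180]."""
    d = (yaw_b - yaw_a) % 360.0
    return d - 360.0 if d > 180.0 else d

# The cubes above: yaw 0 vs yaw 50 -> rotate the trace vector by 50 degrees.
print(delta_yaw(0.0, 50.0))    # 50.0
print(delta_yaw(10.0, 350.0))  # -20.0
```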
Here are a few options to test using this node:
- You can set a default yaw for a room of yaw = 0.
- Why not set a fixed teleport location on every anchor point? If the room rotates, so do the anchor point and its teleport location.
The only problem is that if I use Rotate Vector Around Axis, the vector spins in place instead of orbiting around the other vector.
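For the record, this is a general property of the math rather than anything specific to this thread's setup: rotating a world position spins it around the world origin, but if you rotate the *offset* from the anchor and then add the anchor back, the point orbits the anchor. A plain-Python sketch:

```python
import math

def rotate_around_z(v, degrees):
    """Same math as Rotate Vector Around Axis with axis (0, 0, 1)."""
    rad = math.radians(degrees)
    c, s = math.cos(rad), math.sin(rad)
    return (v[0] * c - v[1] * s, v[0] * s + v[1] * c, v[2])

anchor = (200.0, 300.0, 0.0)
point = (300.0, 300.0, 0.0)  # 100 units from the anchor along +X

# Rotate the offset (point - anchor), not the point's world position,
# then translate back: the point orbits the anchor.
offset = tuple(p - a for p, a in zip(point, anchor))
orbited = tuple(a + r for a, r in zip(anchor, rotate_around_z(offset, 90.0)))
print(orbited)  # approximately (200.0, 400.0, 0.0)
```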
Hey, good news: I was playing around and I got it to work. Rotate Vector Around Axis finally worked, although I don't understand why. My only issue now is making the player face the right way. I rotated the character and controller too, but as you run through the teleporter, it jolts you to the side a little.
You can define a default relative rotation; for instance, the player will always look at the anchor actor. If so, you can get the anchor location, get the player location, use Find Look at Rotation, and then Set Actor Rotation with those inputs.
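The yaw component of a look-at rotation is essentially an atan2 of the offset between the two locations. A plain-Python sketch of that idea (the helper name `find_look_at_yaw` is made up; it mirrors only the yaw part of Find Look at Rotation):

```python
import math

def find_look_at_yaw(from_loc, to_loc):
    """Yaw in degrees that faces from_loc toward to_loc
    (the yaw component of a look-at rotation)."""
    dx = to_loc[0] - from_loc[0]
    dy = to_loc[1] - from_loc[1]
    return math.degrees(math.atan2(dy, dx))

# Player at the origin, anchor 100 units along +Y: the player should face yaw 90.
print(find_look_at_yaw((0.0, 0.0, 0.0), (0.0, 100.0, 0.0)))  # close to 90.0
```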
This is how I set it up. It works in the sense that the player does face the correct way, but for some reason, after the teleport occurs, if the player keeps running (which is what they’ll be doing almost every single time), it jolts them a little to the right.