Why are some rotation values negative and others not?

This makes very little sense to me. Anyone got any idea why?

I used GetBaseAimRotation on my character and split the pin into roll/pitch/yaw to get the yaw for a compass HUD element. It gives me 0 to 180 to the right, then -180 back to -0 on the left. Fair enough, although I don’t quite see why the yaw should come out as anything other than 0…360 or 0…1 or whatever. Rotation should be rotation, after all.

So that one was a bust. Next I tried GetActorRotation. Same deal: it goes -0 to -180 on one side, then 180 back to 0 on the other.

Then I tried GetControlRotation, and this time it goes 0…360 as I wanted.

My point being: why is it not consistent? These are literally meant to represent the same rotation, yet the ranges returned are wildly different. They all point at the same actor, and they all presumably get their values from the transform, so why are they interpreted differently? I was expecting a continuous value from the yaw component of the rotation (as per the control rotation’s values), so what gives with the other values? Looks like some cosine angle value or something.

It’s the kind of thing that trips people up if they don’t know about the various options, so I’m wondering what the thinking might be behind this kind of thing.

Maybe post #3 here can give a little background:

if you say what you are trying to do then maybe you can get some advice.

Do you want to compare 2 rotations or interpolate, or …?

Do you limit it to yaw-only rotations, or arbitrary 3D rotations?

If you just want it in the range 0–360, then just add 360 and take the modulus by 360, and it will get fixed (unless you have negative rotations beyond -360):

return FMath::Fmod(yaw + 360.0f, 360.0f);
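For reference, here is a self-contained sketch of the same idea in plain C++ (`std::fmod` standing in for `FMath::Fmod`; the function name is mine). The extra conditional also handles yaws below -360, which the single add-then-mod above does not:

```cpp
#include <cmath>

// Wrap any yaw in degrees into [0, 360). std::fmod keeps the sign of its
// first argument, so a negative input needs one extra shift afterwards.
float WrapYaw0To360(float YawDegrees)
{
    float Wrapped = std::fmod(YawDegrees, 360.0f); // now in (-360, 360)
    if (Wrapped < 0.0f)
        Wrapped += 360.0f;                         // shift into [0, 360)
    return Wrapped;
}
```

E.g. `WrapYaw0To360(-90.0f)` gives 270, and `WrapYaw0To360(-725.0f)` gives 355, where the one-liner would return a negative number.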

Oh I already did what I wanted. My question was about the thinking of having different rotation representations under different circumstances. Specifically values that logically sound like they should give the same result (rotation of an actor) giving different results depending on which node you use in a blueprint.

I’m trying to wrap my head around that way of thinking. With the hope I can at least start to discern a pattern to it. I guess I’m trying to understand the Unreal programmer mentality, because it feels a bit alien to me still (for instance I’d never return different values without an explicit change in purpose).

Maybe Epic should do something similar to how it is done in Ogre:
have a Degree class and a Radian class that come with special logic for comparing them, or for reading the value so it is in a known interval.

Since you get the Yaw value as a float, as far as the computer is concerned both positive and negative numbers are equally valid. It is not uncommon in programming for things representing the same concept to be stored in different ways, and you are free to store and compute it in whatever way is most efficient.

Sometimes Yaw and Pitch are enough, and keep things simple.
Sometimes Quaternions are more flexible, allowing interpolation.
Sometimes Matrices are efficient at transforming large numbers of vectors.
Sometimes you need to go from one representation to another for those reasons, as certain computations are easier/faster in a specific representation.
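To make the matrix point concrete, here is a minimal plain-C++ sketch (the names are mine, not engine API): the sin/cos for a yaw angle is paid once when the matrix is built, and every vector after that costs only a few multiplies and adds.

```cpp
#include <cmath>

struct Vec2 { float X, Y; };

// A 2x2 yaw rotation matrix: built once, applied to many vectors.
struct YawMatrix
{
    float M[2][2];

    explicit YawMatrix(float YawDegrees)
    {
        const float R = YawDegrees * 3.14159265f / 180.0f;
        M[0][0] = std::cos(R); M[0][1] = -std::sin(R);
        M[1][0] = std::sin(R); M[1][1] =  std::cos(R);
    }

    Vec2 Transform(const Vec2& V) const
    {
        return { M[0][0] * V.X + M[0][1] * V.Y,
                 M[1][0] * V.X + M[1][1] * V.Y };
    }
};
```

Rotating {1, 0} by 90 degrees lands (within float tolerance) on {0, 1}.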

Yes, BUT :slight_smile: here’s the issue. The rotation value at least should be consistent in its value. If I ask for pawn rotation and actor rotation for example, I should get the same value. If a camera is attached to the pawn and I get its rotation I should get the same value. If the controller is attached to the same pawn, I should get the same value.

I’m not griping about what the value represents, I’m just asking why they would be returned differently under different circumstances. They are the same value (the rotation of the actor/pawn) returned as a Rotator, so theoretically they should be the same, only they’re not, and I’m wondering if there’s an understandable reason why. Is it because the controller version of it has a different use-case and they do some internal conversion in that case? I’d like to understand the logic used, because it’s part of how the engine is designed, and it’s important for me to understand the design philosophy of the engine, as I feel less productive without knowing that aspect of programming with it.

It’s the same as why they call position “location”. I don’t understand why. What I want is to get into the mindset of an Epic developer.

It’s worth bearing in mind that a Rotator, as a variable type, is blind to the difference between -90° and 270°. So getting the Rotation of something IS returning the same value. When you break a rotator into float values, it makes what I assume is an educated guess as to HOW to parse that into float values.
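A quick plain-C++ sketch of that “blindness” (a hypothetical helper, not engine code): two yaw readings describe the same facing whenever they differ by a whole number of turns, so -90 and 270 compare equal.

```cpp
#include <cmath>

// True if two yaws in degrees describe the same facing, i.e. they differ
// by a multiple of 360. -90 vs 270 and 10 vs 370 both come out equal.
bool SameFacing(float YawA, float YawB)
{
    float Delta = std::fmod(YawA - YawB, 360.0f); // in (-360, 360)
    return std::fabs(Delta) < 1e-4f
        || std::fabs(std::fabs(Delta) - 360.0f) < 1e-4f;
}
```

The float value you see when you break the pin is just one of the infinitely many numbers that name that facing.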

For example, a Controller represents an absolute rotation in 3D space. Usually, setting a Control rotation is performed in this way. But a pawn’s rotation is frequently thought of in terms of its relation to the control rotation.

Consider an Aim Offset. It needs float values to drive its blend coordinates. It makes sense to think of that range as going from -180 (far left) to 180 (far right). A 0-360 range is not intuitive for a blend of that nature, since you don’t usually blend from “far right” to “far left” in that way, you blend ACROSS straight forward.

In that specific situation, it’s more helpful to have the delta rot between control and actor report a range that’s -180 to 180 rather than 0-360. I feel like most use-cases work this way, where someone at Epic decided that in this circumstance it’s more often useful to get one kind of range than another.

The important thing to remember is to break Rotators as INFREQUENTLY as possible. When you need the difference in yaw between two rotators, don’t subtract yaws; take the delta rot between the rotators and get THAT yaw. Rotators are a special case because they have circular-wrapping values, so there are always going to be scenarios where the reported degrees differ from what the user is expecting.
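For yaw alone, “take the delta rot and read its yaw” boils down to something like this plain-C++ sketch (it mirrors what UE’s FRotator::NormalizeAxis does to an angle, mapping it into the -180 to 180 range; the function name here is mine):

```cpp
#include <cmath>

// Shortest signed yaw difference in degrees, wrapped into [-180, 180).
// Naive subtraction going 350 -> 10 gives -340; this gives +20, the
// short way around the circle.
float DeltaYaw(float FromYaw, float ToYaw)
{
    float Delta = std::fmod(ToYaw - FromYaw, 360.0f); // (-360, 360)
    if (Delta >= 180.0f)  Delta -= 360.0f;
    if (Delta < -180.0f)  Delta += 360.0f;
    return Delta;
}
```

This is exactly why raw yaw subtraction misbehaves near the wrap point: `DeltaYaw(350.0f, 10.0f)` is 20, where `10 - 350` is -340.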