What is considered forward in UE4, Y or X axis?

So Z axis points up.

When creating characters, it seems like we usually make the character point in the Y direction (-Y in 3ds Max, since its coordinate handedness must be flipped). But then I have to rotate the character -90°, which makes me think maybe the X axis is the forward vector.

I know in my own engine, where I had a right-handed coordinate system, -Z was forward since Y was my up. But in Unreal it would make sense that Y is forward, since Z is up and the coords are rotated.

Knowing this will help me know which direction my objects need to face in the modeling program to be in the neutral position, and what the neutral direction of an object is in the component editor.


Yup, UE4's forward axis is +X. Luckily for you, with Max you already have Z-up; in C4D land we are stuck with Y-up, and on import our Z-forward becomes Y-forward in UE4. Same problem with Maya. Hopefully they’ll do some fixes in the importer one day…

In Maya you can actually change the working axis to Z-up, so it becomes a moot point when working Maya-to-UE4.

Not really: in Maya, Y is forward when you change to Z-up, which is still wrong for UE4, where X is forward.

Are you sure the front axis is X? When selecting the front view in the viewport, it sets the camera on the Y axis as front.

Yes, that is correct. “Forward X” is how all your models should be oriented in UE4, especially for skeletal meshes. Static meshes you place in the world aren’t as important, but it is a good habit to orient all meshes this way; aside from something like a cube, anything that has a clearly defined “front” side should be pointing forward along X.

I will never get used to UE4’s XYZ. :frowning:
Most of the time Z is up here, but with 2D planes Y is up, widgets have a Z-order which is the depth, not the height, etc. Inconsistency.

X is forward, but if your character is modeled facing Y you need to rotate it -90° so it looks down X. :cool::cool:

Holy cow, going from Unity to UE4, I always thought of X as being horizontal and went down a math rabbit hole trying to figure out what I was doing wrong. X is forward, cripes.

Well, I’m guessing that engine wasn’t made by a mathematician, as XYZ are arranged in a very unusual way for me.
X should be horizontal, Y vertical and Z depth. If only they had swapped Y and Z, but no, they changed everything: X is depth, Y is horizontal and Z is vertical.
I don’t even find that logical. If they wanted to use axes like a computer screen, Z should be top to bottom and Y left to right, and even then they managed to reverse it.

So in-engine, if you want to go far away you need to go negative X; if you want to go right, negative Y (OK, up is positive Z, but still).

Did you mean Y is down? (Y- is up in widgets)

I think what needs to be made clear is that the “front” view in the editor looks in the Y- direction. Meshes are imported facing Y+. Then when these Y+ facing meshes are used in the Pawn class, the mesh is rotated -90 deg on the Z axis so that it is now facing X+. This works nicely with the MoveForward method in the Pawn class, which positively affects the X axis. The GetActorForwardVector method in the Actor class will also return “forward” in the X+ direction.
The way I like to think of it is as a side scroller game where the character is facing X+. Now you become the character and are in first person view. You are still looking X+.

So what should we set when exporting FBX from Blender: Z up, X forward?

I had good success with making my characters face down X instead of Y, but marketplace assets all have characters facing down Y, so I went back to that so I can reuse animations from the marketplace.

In Unreal +X is forward, +Z is up, +Y is left or right, I forget off the top of my head. I now always make everything face down X except for characters which I have facing Y and rotate them 90 degrees in game.

It’s annoying to have to account for the 90 degree rotation sometimes because I do some things with IK on my character animations and it messes with the different coord spaces, but I’ve worked out most of the kinks. A lot of my IK is in world space so the math works out.

One annoying thing I’ve had to do was take a rotator of a character’s head direction and apply a 90 degree offset to it in C++. Otherwise the pitch ends up being roll due to the space mismatch, and as your character looks up they actually roll their head side to side. I didn’t find a way to do this in blueprint, since blueprint just lets you add to the roll, pitch, and yaw, and doesn’t expose quaternions or a way to transform a rotator by another rotator.

FTransform URDBaseGameplayStatics::WorldTransformToCharacterModelWorldTransform(const FTransform& InTransform)
{
    FTransform Res(InTransform);
    // FRotator is (Pitch, Yaw, Roll), so this applies a -90 degree yaw offset
    Res.ConcatenateRotation(FRotator(0.f, -90.f, 0.f).Quaternion());
    return Res;
}

Future dictionaries, when you look up the word “inconsistency”, will have a section mentioning Unreal’s mesh transform system as an example.

Sorry to reheat a necro-post, but I’m new to UE and just starting to notice the X-facing thing. For example, I have a very basic project and the ‘Player Start’ object in the scene seems to be facing left by default.
Will this ever be changed? It seems very counter-intuitive. Perhaps some clever soul could add the ability to set up the axis order ourselves?
Aside from that… does anyone have some common gotchas I could be aware of when dealing with this kind of axis order?

My very first day in Unreal, and I spent hours wondering why the Rotator Z was rotating me on the ‘wrong’ axis, lol. Then I figured out Yaw was the Z axis. But I hadn’t dreamt that minus-X would mean forward :stuck_out_tongue:!

As an artist coming from Max and learning UE, this one threw me for the first week, too. I’ve since mapped my keyboard shortcuts so “F” is right, “R” is front, “L” is back, and “K” is left. I’m OK with +X being “forward”, but I don’t know if I can unlearn 20 years of my ‘space cube’ orientation.


I mean, it’s arbitrary anyway. I’m pretty certain you can re-orient the project any direction you want if the default isn’t working out for you. As long as you know which directions you want everything to go in, you can do whatever you want.

The only catch might be the kill plane that kills you if you fall out of the map. You can probably change that easily too, though, so don’t be afraid to break from default conventions.

Math calculations in Unreal assume X is forward, Z is up, and Y is right. So that’s kind of what you’re stuck with. It wouldn’t be easy to reconfigure, since every 3D math function in Unreal would have to take that into account.

And Rotators are basically:
Yaw is rotation around the Z axis. Yaw is Left/Right and rotates around the axis that points up.
Pitch is rotation around the Y axis. Pitch is Up/Down and rotates around the axis that points right.
Roll is rotation around the X axis. Roll is “rolling” Left/Right and rotates around the axis that points forward.

It’s a totally arbitrary decision of which axis means what, but once it’s chosen, the entire 3D math of the engine is built around those assumptions.

Like if you take any Quaternion or Rotator with 0 rotation, and have it give you the direction vector, you’d get the vector (1, 0, 0).
If they chose Z to be the forward axis, you’d get (0, 0, 1).

You can totally write some functions that remap things but it’s easier to just get used to the 3D space of the tool you’re using, and when going between 3D applications with different spaces, do that remapping there.

I was able to easily tweak things with the Platformer starter project. Epic built the platformer level such that +Y was left/right, +Z was up, and +X was the vector pointing out from the 2D game plane. That confused me, and I wanted +X to be left/right. I was able to easily change all the logic to assume the character moved along X and was constrained in Y, and rotated the sample starter level 90 degrees. In this case it was a totally arbitrary decision whether X or Y was the character’s side-to-side axis in 2D. All the 3D math functions still assume X is forward, Z is up, Y is right.


So from what I can tell, when I bring in a rig or an object with .fbx, it automatically points along Y. Both Blender and Maya are consistent in terms of orientation: Y to Y from Blender and Z to Y from Maya.

The Alembic settings for Blender and Maya also orient it properly. 90 rotation on the X and -1 Y scale.

I have not dug deeply into the math inside Unreal, and I am not a programmer. The part that is confusing is the fact that the orthographic views and the default camera with no rotations point down the X as forward. Or not confusing, but it is how the math works.

But even the default mannequin and recent mannequin control rig all face down the Y. It seems to be there is a consistent effort to keep the character animation workflow consistent with external apps, regardless of the other orientations, camera and ortho views.

Just my take on it, but I would think for the most part, from my experience, keeping the characters facing Y is the way to work. And in my opinion the best thing is to keep that orientation and build your scenes down the Y, with X as side to side, just as it is in an external app.

But I work mostly with cinematics, so I am not sure if it would cause other issues I can’t see from a game development perspective.