Does the Unreal C++ API support access to vertex normals?

I am using the Unreal C++ API to raycast from a Pawn to a static mesh, grabbing the hit face, and would like to get the vertex normals of the hit face. I am able to get vertex positions (PositionVertexBuffer) and the vertex tangents (StaticMeshVertexBuffer) from the face, but it seems there’s no way to get the vertex normals. I’m drawing this conclusion from reading the FStaticMeshVertexBuffers docs.

I believe I am overlooking something fundamental. Can someone advise me on how to get the vertex normals from a static mesh face?

Here is a reference image to illustrate the vertex normal data I would like to obtain.

You can with a procedural system: Dynamic Mesh, PMC, RMC, and so on.
You can’t do it with a static mesh through the basic Unreal API; you have to dive into these systems that sit underneath Unreal. A normal is just a vector.
There are functions in C++ to lerp between two vectors.

Good luck.

Thanks @GigiK-K-K.

I was planning to do modeling in Blender and then import the mesh into Unreal. However, I’ll take a look at how I might bridge from Blender to one of the other mesh types you listed. Thanks.

Indeed, the vector math is simple for calculating the vertex normals. I have considered doing this calculation on the fly by getting all the faces that contact a vertex position, getting their face normals, and averaging the face normals. However, this requires fetching all the faces that contact a vertex position, and this is also something that I could not locate in the C++ API docs.
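For illustration, the averaging approach described above can be sketched in self-contained C++. This is not Unreal's API: the Vec3 type and the mesh arrays are minimal stand-ins, and the vertex-to-face lookup is done by brute-force scan.

```cpp
#include <array>
#include <cmath>
#include <vector>

// Minimal stand-in for FVector, just for this sketch.
struct Vec3 {
    double x = 0, y = 0, z = 0;
};

static Vec3 Cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 Normalize(const Vec3& v) {
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Average the normals of every face that references vertex `vertexIndex`.
Vec3 SmoothVertexNormal(const std::vector<Vec3>& positions,
                        const std::vector<std::array<int, 3>>& triangles,
                        int vertexIndex) {
    Vec3 sum;
    for (const auto& tri : triangles) {
        if (tri[0] != vertexIndex && tri[1] != vertexIndex && tri[2] != vertexIndex)
            continue;
        // Face normal from the cross product of two triangle edges.
        Vec3 n = Normalize(Cross(Sub(positions[tri[1]], positions[tri[0]]),
                                 Sub(positions[tri[2]], positions[tri[0]])));
        sum.x += n.x; sum.y += n.y; sum.z += n.z;
    }
    return Normalize(sum);
}
```

One caveat: plain averaging is only one convention; area-weighted or angle-weighted averages are also common, and the result will not match the artist-authored normals the importer produces.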

I’ve considered lerping between face normals but that would not quite give the result I’m seeking.

The tangent data is actually the tangent basis and thus contains the normal, tangent, and bitangent.
I believe they are interleaved, so the size of each element in the array is 3 * sizeof(Vector<float>), or 9 * 4 = 36 bytes. Presumably the normal is the third of the vectors (because it’s the Z axis of the tangent basis), but I’m not 100% sure of that. You can run some simple experiments to find out.

You can also use VertexTangentZ to read the normal.
(Again, I’m assuming X == UTangent, Y == VTangent, and Z == Normal, because that’s the usual convention, but I haven’t 100% verified this.)
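To make the layout described above concrete, here is a self-contained sketch of such an interleaved buffer. Note this is a hypothetical flat layout for experimentation, not Unreal's actual storage: the engine packs tangents in a compressed format internally and exposes them through accessors such as VertexTangentZ.

```cpp
#include <cstddef>
#include <vector>

// Minimal stand-in vector type, just for this sketch.
struct Vec3f { float x, y, z; };

// Hypothetical interleaved tangent basis: [tangent, bitangent, normal] per vertex.
// If stored this way, each vertex occupies 3 * sizeof(Vec3f) = 36 bytes, and the
// normal is the third vector (the Z axis of the tangent frame).
Vec3f NormalFromInterleavedBasis(const std::vector<Vec3f>& buffer, size_t vertexIndex) {
    return buffer[vertexIndex * 3 + 2];
}
```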

You want all the vertex normals to point outward?
Blender makes this simple: in its menu there is Recalculate Normals (Outside).

If you want to lerp with C++, you should lerp at 0.5, which is in between; that will get you the middle direction, pointing outwards.

If you want, you can still modify normals in Unreal with these systems, but you have to learn to use them. Your normal is a regular FVector; it is known to be a normal by its position in the section order when you load the section into the component: first come the vertices, then the other attributes, in linear fashion. You can do anything with the data before loading the sections.

A general rule to point normals outwards, inwards, or any way you want: for procedural meshes it’s as simple as this.

normals.Init(FVector(0, 0, 1), 500);

normals being your TArray<FVector> in this case.

After this you can convert the mesh back to a static mesh, by creating a static mesh out of it; there is a tab option for that.

You can also lerp the normal vector on an axis to phase it to a degree (linear interpolation).
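On the lerp idea: linearly interpolating two unit normals and renormalizing (often called nlerp) can be sketched in self-contained C++ like this, with a minimal stand-in vector type rather than Unreal's FVector. At alpha = 0.5 it yields the averaged direction of the two normals.

```cpp
#include <cmath>

// Minimal stand-in vector type, just for this sketch.
struct Vec3d { double x, y, z; };

// Linear interpolation between two unit normals, renormalized so the result
// is again unit length (the raw lerp of two unit vectors is shorter than 1).
Vec3d NlerpNormal(const Vec3d& a, const Vec3d& b, double alpha) {
    Vec3d v{a.x + (b.x - a.x) * alpha,
            a.y + (b.y - a.y) * alpha,
            a.z + (b.z - a.z) * alpha};
    double len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}
```

Unlike a spherical lerp, nlerp does not sweep at constant angular speed, but for blending nearby normals the difference is usually negligible.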

It’s simple with Blender: recalculate normals outside, and when you export, in the FBX export options under the Geometry tab make sure you set Smoothing to Face; that includes the smoothing data, or you can choose Normals Only. In Unreal you can then import the mesh and use a library to load the mesh sections with one of the systems mentioned above.

To view how the normals are pointing in Unreal, double-click the mesh; you will go to the static mesh editor, where the toolbar has toggles for normals, collision, and so on. Once you click Normals it will show you how they point, in what direction.

To get the normals relative to whatever you want, you have to calculate the direction of the normals between the object of reference and the source: the player, the casting object, or whatever. So you have to make a sort of calculation between two objects, though I’m not sure and I never tried it; but it worked with other things, like getting the distance between objects, getting data off of an object when you face it from the front, triggers, and so on with single-line raycasting. Maybe you have to rotate the vector to a degree that it points at the ray, meaning you have to get a forward vector and transform it to a rotation, so your normals can perhaps always point forward towards the ray.

It’s not that simple :)

@jwatte Your suggestion to use the tangent data sounds very promising! It should be straightforward to parse the interleaved values. Could you elaborate on how to access this attribute? I could not find it in the docs.

I have previously tested VertexTangentZ and I found the vector was identical to the face normal (e.g. [0, 0, 1] for a flat surface). Upon consideration, I now think this is the expected behavior because, by convention, vertex tangents exist within the surface’s plane. At least this is my understanding. Sadly, I concluded that VertexTangentZ will always return a vector equivalent to the face normal, not the vertex normal. (The clear exception is when the triangle and its surrounding triangles all sit on the same plane, in which case the vertex normal is equal to the face normal.)

As an additional test, I created a cube in Unreal, opened it in the static mesh editor, clicked Show > Tangents, and this confirms the tangents sit on the face surface plane (see image below).

However, my understanding could be wrong so please do correct me if so. Thank you for the help!

It’s part of the FStaticMeshVertexBuffers.StaticMeshVertexBuffer that you already mentioned above.

Yes, that’s what the surface normals do when you have a sharp corner. A sharp corner means that three surfaces meet, and make a crease, so the vertex gets split, one for each flat surface meeting at the vertex.

To see some smooth vertices, make and export a smooth shaded donut or somesuch. Those will have the “averaged” normals you expect.

You have not shown any code; if you want people to help you solve this problem, you should post the part of the code that is not working for you.

My guess is that your vertex normals are not pointing outwards properly at the angle they need to be, and maybe therefore the raytrace system cannot see them. You should also look at the raytrace system, because it has an ignore list and an allow list; check whether the objects are on those lists or not.

So for the cube: if it’s just a cube face with only 4 vertices per face, then the normals will maybe not point in the direction of the raytrace, because they are adjusted at 0.5 as I told you; they will not point out from the front of the cube, forward, but halfway upwards.

These are soft normals and will not point towards the raytrace.

I still don’t fully understand: did you try to get the normals and could not get them, or do you not know how to get them?

You should try PMC; it has all the data open once the mesh becomes a procedural mesh, and in your trace system you can use the variables that are exposed inside the procedural mesh system.

For the procedural system, the normals have a whole array you can call, and you can branch in your trace system depending on how you have it set up: if the trace hits whatever, then do whatever.

So everything opens up in PMC and becomes available.
This is where the normals are stored, for example:

TArray<FVector> normals;

So as you can see, the vertex normals are a vector.
If you have mesh sections, perfect: you can load the whole mesh section into these kinds of arrays with a library (UKismetProceduralMeshLibrary). It will load all the mesh attributes (tangents, normals, etc.), and now you have easy access to the normals for your trace system.

It’s possible.
Review this thread.

From what I can see, the user in that thread ray traced the normals.

In this simple example I generate 4 blocks on top of each other from bottom to top (randomly picked “rock” or “dirt”). The floor is a chrome plane for reflection. As you can see, the second Mesh Section (“dirt”) has correct Normals in “World Normal” view, but grey(?) Normals in “Ray Tracing World Normal” View. Thus, the reflections in the left image are somehow missing. SSR work fine though (probably since the normal pass is correct).

It’s also possible to raytrace stuff in a PMC with Blueprints.

@jwatte @GigiK-K-K Thanks for your thoughtful suggestions.

@jwatte You are correct and I was wrong. VertexTangentZ does indeed provide the vertex normal. Evidently, the issue I’m having relates to the Blender mesh I was testing as a raycast impact surface.

I have now compared how my Pawn traverses over (1) a sphere from Unreal and (2) a UV sphere from Blender. On the Unreal sphere the vertex normals are reported correctly by VertexTangentZ and I get smooth traversal using the barycentric coordinates to interpolate between vertex normals. In contrast, on the Blender sphere all vertex normals from VertexTangentZ point in the same direction, which causes abrupt orientation snapping when traversing between faces on the mesh.

What would you suggest to correctly get geometry from Blender into Unreal? Relevant details are below.

Here are the two spheres side by side:

The Unreal sphere shows different vertex normals for each vertex of a hit face (expected behavior):

The Blender UV sphere shows vertex normals pointing in the same direction (not expected or desired):

I believe that the issue relates to how I am exporting meshes from Blender and/or importing into Unreal. Here are my settings:

Code snippet:

// Get buffers for vertex positions and vertex normals
FTransform ComponentTransform = StaticMeshComp->GetComponentTransform();

FStaticMeshVertexBuffers* VertexBuffers = &StaticMesh->GetRenderData()->LODResources[0].VertexBuffers;
FStaticMeshVertexBuffer* StaticMeshVertexBuffer = &VertexBuffers->StaticMeshVertexBuffer;
FPositionVertexBuffer* PositionVertexBuffer = &VertexBuffers->PositionVertexBuffer;

FIndexArrayView IndexBuffer = StaticMesh->GetRenderData()->LODResources[0].IndexBuffer.GetArrayView();

// Storage for the triangle vertex positions and vertex normals
FVector VertexPositions[3];
FVector VertexNormals[3];

uint32 index0 = IndexBuffer[FaceIndex * 3 + 0];
VertexPositions[0] = FVector(PositionVertexBuffer->VertexPosition(index0));
VertexNormals[0] = FVector(StaticMeshVertexBuffer->VertexTangentZ(index0));

uint32 index1 = IndexBuffer[FaceIndex * 3 + 1];
VertexPositions[1] = FVector(PositionVertexBuffer->VertexPosition(index1));
VertexNormals[1] = FVector(StaticMeshVertexBuffer->VertexTangentZ(index1));

uint32 index2 = IndexBuffer[FaceIndex * 3 + 2];
VertexPositions[2] = FVector(PositionVertexBuffer->VertexPosition(index2));
VertexNormals[2] = FVector(StaticMeshVertexBuffer->VertexTangentZ(index2));

// Transform vertices and vertex normals from local space to world space
VertexPositions[0] = ComponentTransform.TransformPosition(VertexPositions[0]);
VertexNormals[0] = ComponentTransform.TransformVector(VertexNormals[0]);

VertexPositions[1] = ComponentTransform.TransformPosition(VertexPositions[1]);
VertexNormals[1] = ComponentTransform.TransformVector(VertexNormals[1]);

VertexPositions[2] = ComponentTransform.TransformPosition(VertexPositions[2]);
VertexNormals[2] = ComponentTransform.TransformVector(VertexNormals[2]);

// Store normal vector based on barycentric interpolation
FVector BaryNormal;

// Get barycentric coordinates
FVector b = FMath::ComputeBaryCentric2D(ImpactPoint, VertexPositions[0], VertexPositions[1], VertexPositions[2]);

BaryNormal.X = b.X * VertexNormals[0].X + b.Y * VertexNormals[1].X + b.Z * VertexNormals[2].X;
BaryNormal.Y = b.X * VertexNormals[0].Y + b.Y * VertexNormals[1].Y + b.Z * VertexNormals[2].Y;
BaryNormal.Z = b.X * VertexNormals[0].Z + b.Y * VertexNormals[1].Z + b.Z * VertexNormals[2].Z;

// Normalize
BaryNormal.Normalize();

// Set hover height above surface
SetActorLocation(ImpactPoint + BaryNormal * HoverHeight);

// Set rotation to match surface normal
FVector cp = FVector::CrossProduct(ActorUpVector, BaryNormal);
float dp = FVector::DotProduct(ActorUpVector, BaryNormal);

float s = FMath::Sqrt((1.0 + dp) * 2.0);
float rs = 1.0 / s;

FQuat q(cp.X * rs, cp.Y * rs, cp.Z * rs, s * 0.5);

q = q * ActorQuat;
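As a sanity check on the shortest-arc quaternion construction above, here is a self-contained plain C++ sketch with minimal stand-in types (not Unreal's FVector/FQuat). Rotating the up vector by the constructed quaternion should reproduce the target normal. Note the construction degenerates when the two vectors are antiparallel, since 1 + dot approaches zero.

```cpp
#include <cmath>

// Minimal stand-in types, just for this sketch.
struct Q { double x, y, z, w; };
struct V { double x, y, z; };

// Shortest-arc quaternion rotating unit vector `from` onto unit vector `to`.
// Mirrors the cross/dot construction used in the snippet above.
Q ShortestArc(const V& from, const V& to) {
    V cp{from.y * to.z - from.z * to.y,
         from.z * to.x - from.x * to.z,
         from.x * to.y - from.y * to.x};
    double dp = from.x * to.x + from.y * to.y + from.z * to.z;
    double s = std::sqrt((1.0 + dp) * 2.0);  // degenerate when dp == -1
    double rs = 1.0 / s;
    return {cp.x * rs, cp.y * rs, cp.z * rs, s * 0.5};
}

// Rotate vector v by quaternion q: v' = v + 2w(u × v) + 2(u × (u × v)), u = q.xyz.
V Rotate(const Q& q, const V& v) {
    V u{q.x, q.y, q.z};
    V c1{u.y * v.z - u.z * v.y, u.z * v.x - u.x * v.z, u.x * v.y - u.y * v.x};
    V c2{u.y * c1.z - u.z * c1.y, u.z * c1.x - u.x * c1.z, u.x * c1.y - u.y * c1.x};
    return {v.x + 2.0 * (q.w * c1.x + c2.x),
            v.y + 2.0 * (q.w * c1.y + c2.y),
            v.z + 2.0 * (q.w * c1.z + c2.z)};
}
```

In Unreal itself, FQuat::FindBetweenNormals performs this construction (and handles the antiparallel case), so it can be used instead of the hand-rolled math.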

This is what smoothing groups are for. Make sure you export with correct smoothing groups, and use those smoothing groups on import to generate smooth normals.

@jwatte Thanks, that fixed it.

For anyone coming to this thread in the future the steps are as follows.

  1. Create a UV Sphere in Blender.
  2. Right click on the sphere and select Shade Smooth.
  3. Export as an fbx file from Blender using the settings given in the post above.
  4. Import into Unreal using the settings in the post above.
  5. Add the mesh to the scene. In the Outliner panel find the mesh, right click and select Edit <mesh name> (ctrl + E).
  6. In the static mesh editor check Allow CPUAccess. In the Collision menu click Remove Collision then click Auto Convex Collision. There might be other collision settings that work well. I haven’t tested it.