Generate Procedural Mesh

@JamesG, it would appear that LineTraceComponent doesn't work in packaged builds for me, though it works fine in the editor. Have you run into this? I'm creating my component in C++, setting collision to QueryAndPhysics, the collision object type to ECC_PhysicsBody or ECC_Visibility, and the collision response to all channels to ECR_Block. This setup works perfectly in the editor, but LineTraceComponent fails in a packaged build whether called from C++ or Blueprints. I've created an AnswerHub question for this here: https://answers.unrealengine.com/questions/232679/48p3-uproceduralmeshcomponents-linetracecomponent.html

const auto playerController = GetWorld()->GetFirstPlayerController();
const auto cameraManager = playerController->PlayerCameraManager;
const FCollisionQueryParams LineParams(FName(TEXT("ComponentIsVisibleFrom")), true, NULL);
FHitResult OutHitResult;
bool hit = procmesh->LineTraceComponent(OutHitResult, cameraManager->GetCameraLocation(), cameraManager->GetCameraLocation() + (cameraManager->GetCameraRotation().Vector() * 100000.0f), LineParams);

There are a few physics-related errors that pop up at runtime and at edit time that might be related:

LogPhysics:Error: Attempt to build physics data for /Game/Maps/Map.Map:PersistentLevel.MyObject_C_0.procMeshComponent.BodySetup_3 when we are unable to. This platform requires cooked packages.

LogActor:Warning: Natively added component (MyObject_C_67.DefaultSceneRoot) was left unattached from the actor’s root.
LogPhysics:Warning: PHYSX: …\PhysXExtensions\src\ExtRigidBodyExt.cpp (217) 4 : computeMassAndInertia: Dynamic actor with illegal collision shapes
LogPhysics:Warning: PHYSX: …\PhysXExtensions\src\ExtRigidBodyExt.cpp (266) 4 : PxRigidBodyExt::updateMassAndInertia: Mass and inertia computation failed, setting mass to 1 and inertia to (1,1,1)

@SiggiG:
How did you convert your code from the old ProceduralMeshComponent class to the new one?

The method "SetProceduralMeshTriangles" doesn't exist anymore and I can't find a workaround :-/ Maybe I have to work with the new method "CreateMeshSection", but Google only turns up a single result for it…

Does anyone else have experience with the new component? Maybe a code example, a tutorial, or important hints? :wink:

You should look through this thread, especially JamesG's post with the screenshot showing how to do it in Blueprints. Also have a look at the source code (it's very well commented and easy to understand) and the helper functions; there is an example function that creates a simple box mesh. The "Go to code definition" option you get when right-clicking a Blueprint node is very helpful too.

As far as I have examined the new PMC (I haven't ported my code over yet), you create a PMC and within it you can create multiple sections; each one is treated as a single mesh, and you can assign a different material to each mesh/section.
To create a section you need an array of vertices and an array of triangles. Optionally you can also provide arrays of UVs, tangents, and normals (there is a function that creates the tangents and normals for you if you have the vertices, triangles, and UVs). With those arrays you then call CreateMeshSection and provide an index for the section. If the section has already been created, the old mesh data is overridden and updated with the new data.
If I'm wrong anywhere please correct me; that's just what I've pieced together so far from looking at the source.

One question though: CalculateTangentsForMesh is commented with "UVs are required for correct tangent generation", but the implementation has a path that generates them without UVs. So how, or in what way, will those tangents be incorrect?

[=algu;298655]
@SiggiG:
How did you convert your code from the old ProceduralMeshComponent class to the new one?

The method "SetProceduralMeshTriangles" doesn't exist anymore and I can't find a workaround :-/ Maybe I have to work with the new method "CreateMeshSection", but Google only turns up a single result for it…

Does anyone else have experience with the new component? Maybe a code example, a tutorial, or important hints? :wink:
[/]

Instead of working with triangle structs, you now work with a vertices array and define triangles using an index buffer that refers back into the vertex array.

For example, to create one triangle you would put 3 vectors into the Vertices array and then add [0, 1, 2] to the Triangles array. You can even make triangles re-use the same vertices, which has pros and cons (different approaches are useful for different things).

And yeah the CreateMeshSection is the one you need.

Once you've got that working, you need to learn about calculating your normals and tangents :slight_smile: I'll see if I can put together a simple example soon, like drawing a cylinder.

An example procedural cylinder

Here is a quick example of how to use the new ProceduralMeshComponent in 4.8.

I based this off a more complex method I’m using in my own hobby project, so please excuse the somewhat sloppy code. I have a lot more cached/pre-calculated in my own code, but wanted to keep this example as simple as I could, and added more comments.

This example should demonstrate all the main things you need to take care of when building your own mesh and I hope someone finds it useful!

SimpleCylinderActor.h


// A simple procedural cylinder example
// 27. May 2015 - Sigurdur G. Gunnarsson

#pragma once

#include "GameFramework/Actor.h"
#include "ProceduralMeshComponent.h"
#include "SimpleCylinderActor.generated.h"

UCLASS()
class ASimpleCylinderActor : public AActor
{
	GENERATED_BODY()
	
public:	
	ASimpleCylinderActor();

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Cylinder Parameters")
	float Radius = 10;

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Cylinder Parameters")
	float Height = 20;

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Cylinder Parameters")
	int32 CrossSectionCount = 10;

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Cylinder Parameters")
	bool bCapEnds = true;

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Cylinder Parameters")
	bool bDoubleSided = false;

	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Cylinder Parameters")
	bool bSmoothNormals = true;

	virtual void OnConstruction(const FTransform& Transform) override;
	virtual void BeginPlay() override;
	
private:
	void GenerateMesh();

	UPROPERTY(VisibleAnywhere, Category = Materials)
	UProceduralMeshComponent* mesh;

	void GenerateCylinder(TArray<FVector>& Vertices, TArray<int32>& Triangles, TArray<FVector>& Normals, TArray<FVector2D>& UVs, TArray<FProcMeshTangent>& Tangents, float InHeight, float InWidth, int32 InCrossSectionCount, bool bCapEnds = false, bool bDoubleSided = false, bool bInSmoothNormals = true);
};

SimpleCylinderActor.cpp - Remember to change the first include to your project’s header file!


// A simple procedural cylinder example
// 27. May 2015 - Sigurdur G. Gunnarsson

#include "ProceduralMesh01.h"
#include "SimpleCylinderActor.h"


ASimpleCylinderActor::ASimpleCylinderActor()
{
	mesh = CreateDefaultSubobject<UProceduralMeshComponent>(TEXT("ProceduralMesh"));
	RootComponent = mesh;
}

void ASimpleCylinderActor::OnConstruction(const FTransform& Transform)
{
	Super::OnConstruction(Transform);
	GenerateMesh();
}

void ASimpleCylinderActor::BeginPlay()
{
	Super::BeginPlay();
	GenerateMesh();
}

void ASimpleCylinderActor::GenerateMesh()
{
	TArray<FVector> Vertices;
	TArray<int32> Triangles;
	TArray<FVector> Normals;
	TArray<FVector2D> UVs;
	TArray<FProcMeshTangent> Tangents;
	TArray<FColor> VertexColors;

	GenerateCylinder(Vertices, Triangles, Normals, UVs, Tangents, Height, Radius, CrossSectionCount, bCapEnds, bDoubleSided, bSmoothNormals);

	mesh->ClearAllMeshSections();
	mesh->CreateMeshSection(0, Vertices, Triangles, Normals, UVs, VertexColors, Tangents, false);
}

void ASimpleCylinderActor::GenerateCylinder(TArray<FVector>& Vertices, TArray<int32>& Triangles, TArray<FVector>& Normals, TArray<FVector2D>& UVs, TArray<FProcMeshTangent>& Tangents, float InHeight, float InWidth, int32 InCrossSectionCount, bool bInCapEnds, bool bInDoubleSided, bool bInSmoothNormals)
{
	// -------------------------------------------------------
	// Basic setup
	int VertexIndex = 0;
	int32 NumVerts = InCrossSectionCount * 4; // InCrossSectionCount x 4 verts per face

	// Count extra vertices if double sided
	if (bInDoubleSided)
	{
		NumVerts = NumVerts * 2;
	}

	// Count vertices for caps if set
	if (bInCapEnds)
	{
		NumVerts += 2 * (InCrossSectionCount - 1) * 3;
	}

	// Clear out the arrays passed in
	Triangles.Reset();

	Vertices.Reset();
	Vertices.AddUninitialized(NumVerts);

	Normals.Reset();
	Normals.AddUninitialized(NumVerts);

	Tangents.Reset();
	Tangents.AddUninitialized(NumVerts);

	UVs.Reset();
	UVs.AddUninitialized(NumVerts);

	// -------------------------------------------------------
	// Make a cylinder section
	const float AngleBetweenQuads = (2.0f / (float)(InCrossSectionCount)) * PI;
	const float VMapPerQuad = 1.0f / (float)InCrossSectionCount;
	FVector Offset = FVector(0, 0, InHeight);

	// Start by building up vertices that make up the cylinder sides
	for (int32 QuadIndex = 0; QuadIndex < InCrossSectionCount; QuadIndex++)
	{
		float Angle = (float)QuadIndex * AngleBetweenQuads;
		float NextAngle = (float)(QuadIndex + 1) * AngleBetweenQuads;

		// Set up the vertices
		FVector p0 = FVector(FMath::Cos(Angle) * InWidth, FMath::Sin(Angle) * InWidth, 0.f);
		FVector p1 = FVector(FMath::Cos(NextAngle) * InWidth, FMath::Sin(NextAngle) * InWidth, 0.f);
		FVector p2 = p1 + Offset;
		FVector p3 = p0 + Offset;

		// Set up the quad triangles
		int VertIndex1 = VertexIndex++;
		int VertIndex2 = VertexIndex++;
		int VertIndex3 = VertexIndex++;
		int VertIndex4 = VertexIndex++;

		Vertices[VertIndex1] = p0;
		Vertices[VertIndex2] = p1;
		Vertices[VertIndex3] = p2;
		Vertices[VertIndex4] = p3;

		// Now create two triangles from those four vertices
		// The order of these (clockwise/counter-clockwise) dictates which way the normal will face. 
		Triangles.Add(VertIndex4);
		Triangles.Add(VertIndex3);
		Triangles.Add(VertIndex1);

		Triangles.Add(VertIndex3);
		Triangles.Add(VertIndex2);
		Triangles.Add(VertIndex1);

		// UVs
		UVs[VertIndex1] = FVector2D(VMapPerQuad * QuadIndex, 0.0f);
		UVs[VertIndex2] = FVector2D(VMapPerQuad * (QuadIndex + 1), 0.0f);
		UVs[VertIndex3] = FVector2D(VMapPerQuad * (QuadIndex + 1), 1.0f);
		UVs[VertIndex4] = FVector2D(VMapPerQuad * QuadIndex, 1.0f);

		// Normals
		FVector NormalCurrent = FVector::CrossProduct(Vertices[VertIndex1] - Vertices[VertIndex3], Vertices[VertIndex2] - Vertices[VertIndex3]).GetSafeNormal();

		if (bInSmoothNormals)
		{
			// To smooth normals you give the vertices a different normal value than the polygon they belong to, gfx hardware then knows how to interpolate between those.
			// I do this here as an average between normals of two adjacent polygons
			// TODO re-use calculations between loop iterations (do them once and cache them!), no need to calculate same values every time :)
			float NextNextAngle = (float)(QuadIndex + 2) * AngleBetweenQuads;
			FVector p4 = FVector(FMath::Cos(NextNextAngle) * InWidth, FMath::Sin(NextNextAngle) * InWidth, 0.f);

			// p1 to p4 to p2
			FVector NormalNext = FVector::CrossProduct(p1 - p2, p4 - p2).GetSafeNormal();
			FVector AverageNormalRight = (NormalCurrent + NormalNext) / 2;
			AverageNormalRight = AverageNormalRight.GetSafeNormal();

			float PreviousAngle = (float)(QuadIndex - 1) * AngleBetweenQuads;
			FVector pMinus1 = FVector(FMath::Cos(PreviousAngle) * InWidth, FMath::Sin(PreviousAngle) * InWidth, 0.f);

			// p0 to p3 to pMinus1
			FVector NormalPrevious = FVector::CrossProduct(p0 - pMinus1, p3 - pMinus1).GetSafeNormal();
			FVector AverageNormalLeft = (NormalCurrent + NormalPrevious) / 2;
			AverageNormalLeft = AverageNormalLeft.GetSafeNormal();

			Normals[VertIndex1] = AverageNormalLeft;
			Normals[VertIndex2] = AverageNormalRight;
			Normals[VertIndex3] = AverageNormalRight;
			Normals[VertIndex4] = AverageNormalLeft;
		}
		else
		{
			// If not smoothing we just set the vertex normal to the same normal as the polygon they belong to
			Normals[VertIndex1] = NormalCurrent;
			Normals[VertIndex2] = NormalCurrent;
			Normals[VertIndex3] = NormalCurrent;
			Normals[VertIndex4] = NormalCurrent;
		}

		// Tangents (lie in the plane of the surface, perpendicular to the normal)
		FVector SurfaceTangent = p0 - p1;
		SurfaceTangent = SurfaceTangent.GetSafeNormal();
		Tangents[VertIndex1] = FProcMeshTangent(SurfaceTangent, true);
		Tangents[VertIndex2] = FProcMeshTangent(SurfaceTangent, true);
		Tangents[VertIndex3] = FProcMeshTangent(SurfaceTangent, true);
		Tangents[VertIndex4] = FProcMeshTangent(SurfaceTangent, true);

		// If double sided, create extra polygons but face the normals the other way.
		if (bInDoubleSided)
		{
			VertIndex1 = VertexIndex++;
			VertIndex2 = VertexIndex++;
			VertIndex3 = VertexIndex++;
			VertIndex4 = VertexIndex++;

			Vertices[VertIndex1] = p0;
			Vertices[VertIndex2] = p1;
			Vertices[VertIndex3] = p2;
			Vertices[VertIndex4] = p3;

			Triangles.Add(VertIndex1);
			Triangles.Add(VertIndex3);
			Triangles.Add(VertIndex4);

			Triangles.Add(VertIndex2);
			Triangles.Add(VertIndex3);
			Triangles.Add(VertIndex4);
		}

		if (QuadIndex != 0 && bInCapEnds)
		{
			// Cap is closed by triangles that start at 0, then use the points at the angles for the other corners

			// Bottom
			FVector capVertex0 = FVector(FMath::Cos(0) * InWidth, FMath::Sin(0) * InWidth, 0.f);
			FVector capVertex1 = FVector(FMath::Cos(Angle) * InWidth, FMath::Sin(Angle) * InWidth, 0.f);
			FVector capVertex2 = FVector(FMath::Cos(NextAngle) * InWidth, FMath::Sin(NextAngle) * InWidth, 0.f);

			VertIndex1 = VertexIndex++;
			VertIndex2 = VertexIndex++;
			VertIndex3 = VertexIndex++;
			Vertices[VertIndex1] = capVertex0;
			Vertices[VertIndex2] = capVertex1;
			Vertices[VertIndex3] = capVertex2;

			Triangles.Add(VertIndex1);
			Triangles.Add(VertIndex2);
			Triangles.Add(VertIndex3);

			FVector2D UV1 = FVector2D(FMath::Sin(0), FMath::Cos(0));
			FVector2D UV2 = FVector2D(FMath::Sin(Angle), FMath::Cos(Angle));
			FVector2D UV3 = FVector2D(FMath::Sin(NextAngle), FMath::Cos(NextAngle));

			UVs[VertIndex1] = UV1;
			UVs[VertIndex2] = UV2;
			UVs[VertIndex3] = UV3;

			// Top
			capVertex0 = capVertex0 + Offset;
			capVertex1 = capVertex1 + Offset;
			capVertex2 = capVertex2 + Offset;

			VertIndex1 = VertexIndex++;
			VertIndex2 = VertexIndex++;
			VertIndex3 = VertexIndex++;
			Vertices[VertIndex1] = capVertex0;
			Vertices[VertIndex2] = capVertex1;
			Vertices[VertIndex3] = capVertex2;

			Triangles.Add(VertIndex3);
			Triangles.Add(VertIndex2);
			Triangles.Add(VertIndex1);

			UVs[VertIndex1] = UV1;
			UVs[VertIndex2] = UV2;
			UVs[VertIndex3] = UV3;
		}
	}
}


Is this "procedural mesh generation / custom mesh generation" what we should use for "voxel" terrain generation, or are voxels a totally different thing?

My brain can't grasp the difference yet :frowning:

I want to try (for my own pleasure in creating games) a Minecraft-like world/terrain, but if I add many separate actors with a cube mesh to build my entire world, the game runs at 1 fps when the camera looks at the terrain (many cube actors, you know…) and at max fps when the camera doesn't look at the terrain (0 cube actors).

I learned that I need to use "voxels", and if I understand correctly, a voxel terrain is in fact one single big mesh (or many big meshes assembled together)?

Voxels look like magic to me, and I am not a wizard :frowning: Please tell me this procedural mesh generation / custom mesh generation is a good start for voxels :frowning: Or, if I'm wrong, try to explain to me what a voxel is and whether we can build a voxel world with UE4.

Also, is it possible to make a voxel game in Blueprint?

Why are voxels so hard to understand :frowning:

Thanks for your patience :o

If you are just looking at drawing cubes, there are many ways to do that. To start off, it will be easier for you to use an instanced mesh component to draw a lot of box meshes. You'll need one component per cube type (different textures).
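A minimal sketch of that idea, written against the 4.x API from memory, so treat it as a starting point rather than a drop-in solution. ACubeChunkActor and GrassCubes are made-up names, and you would still need to assign a cube static mesh and material to the component (in the editor or via code):

```cpp
// CubeChunkActor.h - one instanced component per cube type (hypothetical example)
#pragma once

#include "GameFramework/Actor.h"
#include "Components/InstancedStaticMeshComponent.h"
#include "CubeChunkActor.generated.h"

UCLASS()
class ACubeChunkActor : public AActor
{
	GENERATED_BODY()

public:
	ACubeChunkActor()
	{
		GrassCubes = CreateDefaultSubobject<UInstancedStaticMeshComponent>(TEXT("GrassCubes"));
		RootComponent = GrassCubes;
	}

	virtual void BeginPlay() override
	{
		Super::BeginPlay();

		// All of these instances render together, instead of one actor per cube.
		const float CubeSize = 100.f;
		for (int32 X = 0; X < 16; X++)
		{
			for (int32 Y = 0; Y < 16; Y++)
			{
				GrassCubes->AddInstance(FTransform(FVector(X * CubeSize, Y * CubeSize, 0.f)));
			}
		}
	}

	UPROPERTY(VisibleAnywhere)
	UInstancedStaticMeshComponent* GrassCubes;
};
```

AddInstance takes a transform relative to the component, so a single component can place thousands of cubes of one type; add a second instanced component for the next cube type.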

Is there any way to support the procedural mesh for HTML5?

Hey guys,

I have a problem with the current code from the wiki. I deleted all light sources, added a sky light, and when I drop its intensity to 0, procedural meshes close to (0,0,0) are not correctly affected by light (we can still see them, but they are supposed to be completely black).

It's not that visible, so I zoomed in to activate the "see in the dark" system (I don't know what it's called), and you can see that the procedural meshes are lit while the floor (a static mesh) is not.

The issue is better explained here.

I thought this would be a good place to ask because people here are more comfortable with procedural meshes, I think. Everything I made is on 4.7.6 with the current code from the wiki: https://wiki.unrealengine.com/Procedural_Mesh_Generation

Thank you!

Is there any way to suppress the warning from the procedural mesh:

LogPhysics:Warning: PHYSX: …\PhysXExtensions\src\ExtRigidBodyExt.cpp (266) 4 : PxRigidBodyExt::updateMassAndInertia: Mass and inertia computation failed, setting mass to 1 and inertia to (1,1,1)
LogPhysics:Warning: PHYSX: …\PhysXExtensions\src\ExtRigidBodyExt.cpp (217) 4 : computeMassAndInertia: Dynamic actor with illegal collision shapes

For those who are just trying to get started with this in Blueprint, like I was…

  • Create an empty actor
  • Add a procedural mesh component

In Blueprint:

  • Define data to create a mesh section
  • Add a material

Thanks for sharing!

How to set up the project

Hi

How do you set up the project so that #include "ProceduralMeshComponent.h" is found and UProceduralMeshComponent is defined?

I have tried adding these lines to my Build.cs file:

using UnrealBuildTool;

public class first : ModuleRules
{
    public first(TargetInfo Target)
    {
        PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine", "InputCore" });

        PrivateDependencyModuleNames.AddRange(new string[] { "CustomMeshComponent" });
        PrivateIncludePathModuleNames.AddRange(new string[] { "CustomMeshComponent" });

        PublicIncludePaths.AddRange(new string[] { "CustomMeshComponent/Public", "CustomMeshComponent/Classes", "CustomMeshComponent/Private" });
    }
}

Do I need Unreal's ProceduralMeshComponent.h and ProceduralMeshComponent.cpp files in my project's folders?

Regards

Leesan, you should be depending on "ProceduralMeshComponent" instead of "CustomMeshComponent".

Like this:

PublicDependencyModuleNames.AddRange(new string[] { "ProceduralMeshComponent" });

[=SiggiG;319794]
Leesan, you should be depending on "ProceduralMeshComponent" instead of "CustomMeshComponent".

Like this:

PublicDependencyModuleNames.AddRange(new string[] { "ProceduralMeshComponent" });
[/]

I added this line to my Build.cs and I still get an error when including "ProceduralMeshComponent.h" in my actor class. My Build.cs looks like this:

public class Game : ModuleRules
{
    public Game(TargetInfo Target)
    {
        PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine", "InputCore", "ProceduralMeshComponent" });
        PrivateDependencyModuleNames.AddRange(new string[] { "ProceduralMeshComponent" });
    }
}

How do you access the header files? I've tried CustomMeshComponent and ProceduralMeshComponent, neither of which I can include correctly into my project. I've tried adding the suggested lines above into my Build.cs. What's the trick here?

Thank you, but “Please Sir. May I have some more?”

Dracolytch, thanks for this. It's a good base to start from, but I do have some questions. I am a bit confused about the relationship between the single triangle's worth of vertices in the vertex array and the grid of triangles made by the utility function. Shouldn't the array of vertices be the list of vertices for all of the triangles in the triangle array? Using this Blueprint as-is renders just a single triangle, not a square grid as I would have expected from the name of the function, which I assume is because the vertex array only holds a single triangle's worth of vertices. Looking in the Triangles array after it's been created shows index values such as 12, 98, etc., which don't map to the three indices of the vertex array. Am I just missing the implication that I need to define all those vertices myself? For some reason I was expecting the grid-gen function to create the vertices as well as the triangle array. I mean, how does it know which of the vertices I've defined myself it should use? If it's generating a quad grid of x by y size, I would expect it to define the vertex positions as well as the triangle definitions. Am I just missing something?

I guess what's confusing me is the fact that I can't be sure what the arrays do. It's clear that the vertex array defines points in space, but if the triangle array references the indices of the vertex array to define a triangle, how does it work? It's just an array of integers, so I have to assume it relies on the order of the indices in the array to define a triangle, so the first three indices define the first triangle in the mesh, etc. At least, that's what I THINK is going on. But wouldn't it make more sense to use an array of a "triangle" struct, with that struct containing the indices into the vertex array as well as the normal vector, vertex colors, UVs, etc.? I'm just lost here and could really use a push (shove, face kick, whatever works) in the right direction.

What would be really, really awesome is actual documentation on this. I know it's still an experimental feature, but how are we supposed to test it and advise on whether it's a useful feature that should be expanded upon if we can't figure out how to use it? I've tried searching multiple different ways and haven't found anything more informative than this forum thread, which really doesn't have all that much info. Is it possible to get an update from Epic on the status of the docs for this feature? I really would like to explore using this component in combination with 's map gen system to make whole worlds. Even just an explanation of how to define the needed variable types and how they work together would be immensely useful. Cheers,

J^2

Answering my own questions, I have figured out most of it. But I have to ask, why was it implemented this way? Why have the triangles represented as an array of integers that map back to the vertex array? It makes operations like subdivision a huge pain compared to being able to just add/remove triangles at will. I found using the Custom Mesh’s triangle structure array much more intuitive to work with. I’m wondering why that workflow was ditched in favor of this one. Is there a technical reason (more efficient memory usage, better for the blueprint system, etc)? Because from a logical point of view, I seem to be missing any advantage in having it structured like this. Of course, I’m not a programmer so I’d love to hear from one on this.

J^2

[=J. J. Franzen;326929]
Answering my own questions, I have figured out most of it.
[/]

Great!

[=J. J. Franzen;326929]
But I have to ask, why was it implemented this way?
[/]

Using an 'index buffer' like this is how most stuff actually gets rendered under the hood. The big advantage is that you can re-use vertices. So if you imagine a rolling landscape, each vertex is actually re-used by about 6 triangles - it is 'shared' by the triangles around it. By splitting vertex information from triangle information, that vertex only has to be copied/transformed once. In the old scheme, you would have to duplicate the vertex. Once you start thinking of things as "OK, which unique vertices do I need, and how do I connect them into triangles?", I think it is actually easier to build stuff. Certainly my test Blueprints are simpler using ProcMeshComp than CustomMeshComp.