Generate Procedural Mesh

I'm looking for a start-to-finish example of a working C++ implementation of the new UProceduralMeshComponent. Does that exist yet, or is the information still scattered in scraps all over the place?

Hi,

Is there a way to prevent the editor from saving procedural mesh content? Before Epic's staff integrated the custom code into the main branch I had no problems, but after that my maps exploded beyond 2 GB for very simple meshes, and saving now takes an insane amount of time: minutes every time I save something.

I only have procedural meshes, which I generate by reading an external file.

I found myself needing to create a video with Matinee, but every time I want to record it has to save and launch the application, and when it doesn't crash it takes forever. On top of that, it warns that there are external dependencies that will create problems at run time, or something like that.

Thanks

[=Pixel452;408939]
Is there a way to prevent the editor from saving procedural mesh content? Before Epic's staff integrated the custom code into the main branch I had no problems, but after that my maps exploded beyond 2 GB for very simple meshes, and saving now takes an insane amount of time: minutes every time I save something.

I only have procedural meshes, which I generate by reading an external file.

I found myself needing to create a video with Matinee, but every time I want to record it has to save and launch the application, and when it doesn't crash it takes forever. On top of that, it warns that there are external dependencies that will create problems at run time, or something like that.

[/]

I have no experience with this topic, but I'd look for a way to make the component Transient somehow; maybe that helps.

Good tip, thanks! I will try as soon as possible and let you know.

It doesn't seem to be working. I tried setting bTransient to true when creating the mesh object inside my custom actor class. Any other ideas?

Hey guys,

In the past I mostly did 2D stuff and tried out Unity for some time; I am currently testing out Unreal Engine. For my first real project I want to "port" a project I did in Unity, and by port I mean completely redo it from scratch in Unreal :). The terrain I want to build should be based on a tilemap and should look like that. My problem is that I am simply not able to create such a terrain. In Unity I could simply create a tile from two triangles, so I had a flat terrain. Obviously I don't want the tiles to be transformable, but at the beginning all of them will be flat (just for training). I tried using the wiki.unrealengine.com/Procedural_Mesh_Generation stuff, but I could not manage to get it running in Unreal 4.10.

Can you guys give me any hints on how I could make a few simple mesh tiles?
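For reference, here is a minimal sketch of a single flat tile built from two triangles with UProceduralMeshComponent. Treat it as an illustration rather than a drop-in solution: BuildFlatTile and ProcMesh are placeholder names, and the CreateMeshSection signature shown is the 4.10-era one, so check it against your engine version.


#include "ProceduralMeshComponent.h"

// Builds one 100x100 tile in the XY plane as a single mesh section.
void BuildFlatTile(UProceduralMeshComponent* ProcMesh)
{
	TArray<FVector> Vertices;
	Vertices.Add(FVector(0.f, 0.f, 0.f));
	Vertices.Add(FVector(0.f, 100.f, 0.f));
	Vertices.Add(FVector(100.f, 100.f, 0.f));
	Vertices.Add(FVector(100.f, 0.f, 0.f));

	// Two triangles sharing the diagonal. If the tile is only visible from the
	// wrong side, reverse the index order of each triangle.
	TArray<int32> Triangles;
	Triangles.Add(0); Triangles.Add(1); Triangles.Add(2);
	Triangles.Add(0); Triangles.Add(2); Triangles.Add(3);

	TArray<FVector> Normals;
	Normals.Init(FVector(0.f, 0.f, 1.f), Vertices.Num());

	TArray<FVector2D> UVs;
	UVs.Add(FVector2D(0.f, 0.f));
	UVs.Add(FVector2D(0.f, 1.f));
	UVs.Add(FVector2D(1.f, 1.f));
	UVs.Add(FVector2D(1.f, 0.f));

	// Vertex colors and tangents are optional and left empty here.
	ProcMesh->CreateMeshSection(0, Vertices, Triangles, Normals, UVs,
		TArray<FColor>(), TArray<FProcMeshTangent>(), /*bCreateCollision=*/false);
}


Repeating this per tile (or, better, packing many tiles into one section) gives a flat tile-based terrain to start from.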

I am creating a mesh with the procedural mesh component, using the GetStaticMeshVertexLocations function to access the vertices.

In the picture, the left one is a static mesh and the right one is created with the procedural mesh component. But when I look at it from different angles, it only shows the inside of the mesh.

Do you have any idea ?

Hey,

I found a solution, in case someone has the same problem: I changed which side the faces point to. My faces were pointing toward the inside of the mesh before.

[=Korcan;415203]

I am creating a mesh with the procedural mesh component, using the GetStaticMeshVertexLocations function to access the vertices.

In the picture, the left one is a static mesh and the right one is created with the procedural mesh component. But when I look at it from different angles, it only shows the inside of the mesh.

Do you have any idea ?
[/]

Hey guys,

I'm new to generating procedural meshes in UE4. I already did something like that in Unity, but I'm currently trying to build a Minecraft-like block terrain in Unreal. After a few days it finally worked, but it is currently untextured. My plan for "texturing" the mesh was to assign different material IDs, just like in a static mesh, and use a different material for every block type, like dirt, etc. I first thought of using a texture atlas via the UVs of the generated mesh, but I need to assign very different materials, not just textures. So my question is: is there any way of assigning material IDs to a mesh generated with UProceduralMeshComponent? Or is there an even better way of achieving different materials on the surface of my terrain? (See the sketch at the end of this post.)

I recently tried mesh instancing instead of generating a mesh, but it seemed fairly wasteful to draw every whole cube even when only the top face is visible.

Thanks!
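One approach that may help with the material ID question above: UProceduralMeshComponent supports multiple mesh sections, and each section index can be given its own material, which effectively behaves like material IDs on a static mesh. A rough sketch, where BuildTerrainSections, DirtMaterial and StoneMaterial are placeholder names and the vertex/triangle arrays are assumed to be filled from your voxel data:


#include "ProceduralMeshComponent.h"

// Sketch: one mesh section per block type, one material per section.
void BuildTerrainSections(UProceduralMeshComponent* ProcMesh,
	UMaterialInterface* DirtMaterial,
	UMaterialInterface* StoneMaterial)
{
	TArray<FVector> DirtVerts, StoneVerts;
	TArray<int32> DirtTris, StoneTris;
	TArray<FVector> DirtNormals, StoneNormals;
	TArray<FVector2D> DirtUVs, StoneUVs;

	// ... fill the arrays above from the generated terrain ...

	ProcMesh->CreateMeshSection(0, DirtVerts, DirtTris, DirtNormals, DirtUVs,
		TArray<FColor>(), TArray<FProcMeshTangent>(), true);
	ProcMesh->CreateMeshSection(1, StoneVerts, StoneTris, StoneNormals, StoneUVs,
		TArray<FColor>(), TArray<FProcMeshTangent>(), true);

	// The section index doubles as the material element index on the component.
	ProcMesh->SetMaterial(0, DirtMaterial);
	ProcMesh->SetMaterial(1, StoneMaterial);
}


The trade-off is roughly one draw call per section, so for a very large terrain a texture atlas or a single material that switches textures per face may still end up cheaper.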

Hi,

Is it in any way possible to optimize a level that might use a large amount of procedural mesh? I was thinking of creating the meshes as static meshes in the background to save memory. I've found that the new procedural mesh is pretty heavy… Does anyone know if this would be possible?

Hey all, I’m having an issue with vertex winding order. Please have a look:

  1. I'm using the UnrealJS plugin (github.com/ncsoft/Unreal.js), which provides a V8 runtime
  2. I created a blueprint that has variables for vertices, triangles, normals, uvs, and sends it to a procedural mesh
  3. I extended this BP class in Javascript, and replaced the base-class’s vertices/faces etc with values generated from THREE.js

The BP

The JS code



    // Blueprint class can be subclassed!
    class ProceduralJSMesh extends Blueprint.Load('/Game/ProceduralBox').GeneratedClass {
        // constructor
        ctor() {
            // Subobject initialization, property initialization
            this.bAlwaysRelevant = true;
            console.log( 'constructing procedural js mesh' );
        }

        // declare UPROPERTY's here
        // ; this.XXXXX/*[attribute+]+type*/;
        properties() {
        }

        ReceiveBeginPlay() {

            const geo = new THREE.BoxGeometry( 200, 200, 200, 1, 1, 1 );

            this.vertices = geo.faces.reduce( function( arr, f ){
                arr.push( threeVertexToUEVertex( geo.vertices[ f.a ] ) );
                arr.push( threeVertexToUEVertex( geo.vertices[ f.b ] ) );
                arr.push( threeVertexToUEVertex( geo.vertices[ f.c ] ) );
                return arr;
            }, [] );

            this.triangles = geo.faces.reduce( function( arr, f ){
                arr.push( f.a );
                arr.push( f.b );
                arr.push( f.c );
                return arr;
            }, [] );

            this.normals = geo.faces.reduce( function( arr, f ){
                arr.push( threeVertexToUEVertex( f.vertexNormals[ 0 ] ) );
                arr.push( threeVertexToUEVertex( f.vertexNormals[ 1 ] ) );
                arr.push( threeVertexToUEVertex( f.vertexNormals[ 2 ] ) );
                return arr;
            }, [] );

            this.uvs = geo.faceVertexUvs[ 0 ].reduce( function( arr, uvs ){
                arr.push({
                    X: uvs[2].x,
                    Y: uvs[2].y
                });
                return arr;
            }, [] );

            super.ReceiveBeginPlay();

            console.log( 'vertices length', this.vertices.length );
            console.log( 'triangles length', this.triangles.length );
            console.log( 'normals length', this.normals.length );
        }
    }


The result:

It seems like the vertex ordering is different between THREE.js and Unreal Engine. I'm not really sure why; I've tried flipping the vertices around (c/b/a, c/a/b, b/a/c, and basically every other combination) and still couldn't get the correct vertex ordering. It also seems like the triangle indices are not selecting the correct vertices.

So I tried another experiment, much simpler this time.

Assuming we have vertices here:





 0,0+---------------------------+200,0
   ++                              +
   | ++                            |
   |   ++                          |
   |     ++                        |
   |       ++                      |
   |         ++                    |
   |           ++                  |
   |             ++                |
   |               ++              |
   |                 ++            |
   |                   ++          |
   |                     ++        |
   |                       ++      |
   |                         ++    |
   |                           ++  |
   +-------------------------------+
0,200                          200,200





We can construct the faces like so:



            geo.vertices.push( new THREE.Vector3(0,0,0), new THREE.Vector3(0,200,0), new THREE.Vector3(200,200,0) );
            geo.vertices.push( new THREE.Vector3(0,0,0), new THREE.Vector3(200,0,0), new THREE.Vector3(200,200,0) );
            geo.faces.push( new THREE.Face3( 0, 1, 2 ) );
            geo.faces.push( new THREE.Face3( 3, 4, 5 ) );


However, here's what happens:

If we flip the vertices like this (note that in the second set of vertices, the 2nd and 3rd are swapped):



            geo.vertices.push( new THREE.Vector3(0,0,0), new THREE.Vector3(0,200,0), new THREE.Vector3(200,200,0) );
            geo.vertices.push( new THREE.Vector3(0,0,0), new THREE.Vector3(200,200,0), new THREE.Vector3(200,0,0) );


Then we’ve got the correct order?

I don't really understand what's happening here. Note that I'm still very much a beginner at UE, so please bear with me. I need help!

Is there anything I can check between the verts to ensure the correct vertex ordering or normal facing?
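One way to sanity-check the data (a sketch, shown here on the C++/UE side with a hypothetical helper, but the math is the same in JS): compute each triangle's geometric normal from the cross product and compare it with the normal you expect for that face. If the sign is negative, swap two of the triangle's indices to flip its facing.


// Hypothetical helper: returns true if triangle (A, B, C) faces roughly the same
// way as ExpectedNormal (e.g. the averaged vertex normal for that face).
bool IsWindingConsistent(const FVector& A, const FVector& B, const FVector& C,
	const FVector& ExpectedNormal)
{
	const FVector FaceNormal = FVector::CrossProduct(B - A, C - A).GetSafeNormal();
	return FVector::DotProduct(FaceNormal, ExpectedNormal) > 0.f;
}

// If the check fails for a triangle, emit its indices as (a, c, b) instead of (a, b, c).


Note that even with internally consistent winding, the engine may still expect the opposite orientation from what THREE.js produces (the handedness conventions differ), in which case flipping every triangle once, as in the c/b/a order used later in this thread, is the usual fix.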

I think it's better to use quads; try this:



 0,0-----------------------200,0
  (0)                         (1)
   |                          |
   |                          | 
   |                          |
 (3) ------------------------(2)
0,200                         200,200

geo.vertices.push( new THREE.Vector3(0,0,0), new THREE.Vector3(200,0,0), new THREE.Vector3(200,200,0),new THREE.Vector3(0,200,0) );
geo.faces.push( new THREE.Face3( 0, 1, 3 ) );
geo.faces.push( new THREE.Face3( 1, 2, 3 ) );



I've got two questions. I'm trying to get UProceduralMeshComponent recognized by Visual Studio 2015 with UE 4.10.

First, I've set up my build file with the following:



        PublicIncludePaths.AddRange(new string[] { "ProceduralMeshComponent/Public", "ProceduralMeshComponent/Classes" });
        PublicDependencyModuleNames.AddRange(new string[] { "Core", "CoreUObject", "Engine", "InputCore", "ProceduralMeshComponent" });
        PrivateDependencyModuleNames.AddRange(new string[] { "ProceduralMeshComponent" });


However, I still get the IDE error ‘cannot open source file “ProceduralMeshComponent.h”’ when attempting to include the header in my Actor header. My reference to UProceduralMeshComponent also shows an error: ‘identifier “UProceduralMeshComponent” is undefined’. Nonetheless, the project seems to compile successfully. I’m wondering if there is a way to resolve these references? Has anything changed in recent versions?

Second, was the problem ever solved for preventing the mesh from being serialized?

Thanks in advance.

Hello ,

This thread has been a lot of use to me, as I'm using the CustomMeshComponent to manually generate meshes I extract from Alembic files. However, I'm facing a small problem that I hope you'll be able to help me with.

My meshes aren't smooth enough: http://puu.sh/lJ25H/2578d51567.jpg . The problem is that, so far, I only fetch vertices and UVs from my Alembic files, no smoothed normals or anything like that. How can I properly smooth my model? I looked for a way to access normals in UCustomMeshComponent, but no luck so far. (A sketch of one approach follows below.)

Thanks in advance!
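For the smoothing question above, a common approach (sketched here against UE types, not the Alembic API; ComputeSmoothNormals is a made-up helper) is to compute averaged per-vertex normals yourself: accumulate each triangle's cross-product normal into its three vertices, then normalize. This only smooths across triangles that actually share vertex indices, so if your Alembic data duplicates vertices per face you would need to weld them by position first.


// Sketch: area-weighted smooth normals from an indexed triangle list.
// Vertices/Triangles are the same arrays you pass to the mesh component.
void ComputeSmoothNormals(const TArray<FVector>& Vertices,
	const TArray<int32>& Triangles,
	TArray<FVector>& OutNormals)
{
	OutNormals.Init(FVector::ZeroVector, Vertices.Num());

	for (int32 i = 0; i + 2 < Triangles.Num(); i += 3)
	{
		const int32 I0 = Triangles[i];
		const int32 I1 = Triangles[i + 1];
		const int32 I2 = Triangles[i + 2];

		// Unnormalized cross product: bigger triangles contribute more weight.
		const FVector FaceNormal =
			FVector::CrossProduct(Vertices[I1] - Vertices[I0], Vertices[I2] - Vertices[I0]);

		OutNormals[I0] += FaceNormal;
		OutNormals[I1] += FaceNormal;
		OutNormals[I2] += FaceNormal;
	}

	for (FVector& N : OutNormals)
	{
		N = N.GetSafeNormal();
	}
}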

I've been able to get THREE.js geometry into Unreal Engine. It seems wild: I've been able to do procedural meshes in THREE.js easily for a while, so seeing it in Unreal Engine now is kind of mind-blowing.

Here’s the route.

  1. You have to install UnrealJS (forums.unrealengine.com/showthread.php?92022-Unreal-js).
  2. Create an actor blueprint that looks like this. Note that it has properties like vertices, faces, uvs, etc., which will be replaced by an inheriting class. The sequence can obviously be omitted; that was just me debugging stuff.
  3. Create a JS script like the one I posted on Pastebin. Note that I'm using NPM to install THREE.js so it's AMD/CommonJS compatible. Here, we take a look at the class we're inheriting from the blueprint we've created:

class ProceduralJSMesh extends Blueprint.Load('/Game/ProceduralBox').GeneratedClass {
        ctor() {
            // Subobject initialization, property initialization
            this.bAlwaysRelevant = true;
            console.log( 'constructing procedural js mesh' );
        }

        ReceiveBeginPlay() {

            const geo = new THREE.TorusKnotGeometry(200, 40, 128, 24, 2, 3, 1);

            this.vertices = geo.vertices.map( function( vertex ){
                return threeVertexToUEVertex( vertex );
            });

            this.triangles = geo.faces.reduce( function( arr, f ){
                arr.push( f.c );
                arr.push( f.b );
                arr.push( f.a );
                return arr;
            }, [] );

            this.normals = geo.faces.reduce( function( arr, f ){
                arr.push( threeVertexToUEVertex( f.vertexNormals[ 0 ] ) );
                arr.push( threeVertexToUEVertex( f.vertexNormals[ 1 ] ) );
                arr.push( threeVertexToUEVertex( f.vertexNormals[ 2 ] ) );
                return arr;
            }, [] );

            this.uvs = geo.faceVertexUvs[ 0 ].reduce( function( arr, uvs ){
                uvs.forEach( function( vertex ){
                    arr.push( {
                        X: vertex.x,
                        Y: vertex.y
                    } );
                });
                return arr;
            }, [] );

            super.ReceiveBeginPlay();

        }
}

The first thing to note is that we're generating geometry in ReceiveBeginPlay. There's probably a good reason for this; I think the BP is only fully realized in its inheritance stack at that point, since doing the procgen in the constructor doesn't work.

The second thing to note is that THREE.js vertices are {v.x, v.y, v.z}, whereas Unreal's are {v.X, v.Y, v.Z}, so I wrote a small function to convert that (check it in the pastebin).

Finally, note the geometry format of THREE.js. Each vertex maps one-to-one from THREE to Unreal, so we can just walk down the vertex array. The faces have to be iterated through, pushing the face indices one by one, except flipped (c/b/a instead of a/b/c). The normals are also pushed, as well as the UVs. The parent class will take care of tangent generation (is that actually required?).

  4. Finally, you'll need an actor with a V8 (JavaScript/UnrealJS) component in your Unreal scene. This triggers running the JS code, which instantiates the class your procgen geometry lives in. Unreal then lights it and does the rest properly.

What's amazing about working like this is that, beyond the flexibility you get from JS, you also get live reloading. That's right: whenever I hit save, the geometry instantaneously updates. I've actually no clue how it does that, or what triggers it. I assume UnrealJS does some live updating of the V8 instance, and there's an actor-destroy when exiting the JS code, so it just kills the actor and creates a new one in its place. Pretty rad.

Why am I doing this? Because a majority of my game (and its procgen) is already written in JS, so I needed a way to save LOTS of work. Not only can I run my game in JS while being in UE4, I can also port my procgen code directly which is amazing.

@mflux Great post! BTW, I think that BP class isn't necessary; it can be implemented/replaced with JS only. You can create the procedural mesh component inside your JS class. :slight_smile:
I also think it should be possible to publish some of your three.js / Unreal.js code as a module on NPM, so others can get it with 'npm i three-ue4'.

[=crush1983;429904]

Second, was the problem ever solved for preventing the mesh from being serialized?

[/]

I do this in the header file:



public:
	UPROPERTY(Transient, DuplicateTransient)
	UProceduralMeshComponent* Mesh = nullptr;

	virtual void OnConstruction(const FTransform& Transform) override;

protected:
	void BeginPlay() override;


Then in C++, to handle construction in the editor we create the mesh during OnConstruction; to handle construction during gameplay we create the mesh in BeginPlay.

Note that OnConstruction fires on quite a few occasions in the editor: on every frame while moving a prop, when compiling the blueprint, and so on. You'll want to test whether Mesh == nullptr and only construct it when needed.

A special case to consider is when the blueprint containing the procedural mesh is added as a ChildActor to another blueprint; in that case OnConstruction (with Mesh initialized to nullptr) also fires on the child actors every time a property changes on the parent actor.
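A rough sketch of that guard, using the member names from the header snippet above (the actor class name AMyProcMeshActor is made up, and the attach call may differ by engine version):


void AMyProcMeshActor::OnConstruction(const FTransform& Transform)
{
	Super::OnConstruction(Transform);

	// OnConstruction fires repeatedly in the editor (dragging, blueprint recompiles,
	// parent-actor property changes on child actors), so only build the mesh once.
	if (Mesh == nullptr)
	{
		Mesh = NewObject<UProceduralMeshComponent>(this, TEXT("ProcMesh"));
		Mesh->AttachTo(RootComponent); // AttachToComponent in newer engine versions
		Mesh->RegisterComponent();

		// ... fill vertex/triangle arrays and call CreateMeshSection(...) here ...
	}
}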

@mflux Your procedural mesh generator can be executed in the editor by checking 'activate in editor'. :slight_smile:

nako_sung, where is this 'activate in editor' you speak of?

@ioFlows Studios The JavascriptComponent (embedded in the 'Actor' placed in all the test levels) has bActiveWithinEditor, which is false by default. Unfortunately there is no working example for this setting.