Generate Procedural Mesh

Has any of you had issues with lighting?
I have a world generated from cubes. Every cube can have 4 faces with 4 vertices each. If I have a sun (a directional light) that moves across my sky and at some point dips beneath the world, the top should be dark (it should be night). But now I have the problem that the bottom faces (and other faces that are visible from below) are lit, so in the bottom corners I have the black unlit vertices from the “overworld” and, at the same position, the lit vertices from the “underworld”. And those lit vertices are rendered, so I get an awkward result:
[Image: a155d8dcf0c77ed40278fe0ceaa28d5f6e3a604c.jpeg]
The green lines belong to the faces that are lit from below. The expected result would be completely black, because the sun is not visible.

Does anyone have an idea how to fix this problem? Usually I would decide which vertices should be rendered with the help of the face normals, but I have no idea where in my code I could approach this. The material is pretty simple: it’s just a two-sided material rendered with a base color (green in this case).
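
For what it’s worth, the artifact is consistent with how two-sided lighting works: the renderer flips the normal for back faces, so a face whose front side points away from the sun still receives light on its back side. A minimal sketch of the difference (plain C++, not engine code; the vector type and function names are just for illustration):

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// One-sided Lambert term: faces pointing away from the light stay black.
static float OneSidedDiffuse(const Vec3& N, const Vec3& L) {
    return std::max(Dot(N, L), 0.0f);
}

// Two-sided Lambert term: the normal is effectively flipped for back faces,
// which is what lights the bottom faces when the sun is below the world.
static float TwoSidedDiffuse(const Vec3& N, const Vec3& L) {
    return std::fabs(Dot(N, L));
}
```

With the sun below the horizon (direction to the light `L = (0, 0, -1)`) and a top face (`N = (0, 0, 1)`), the one-sided term is 0 but the two-sided term is 1, which matches the green faces showing through at night. Making the material single-sided, or emitting explicit back faces with their own opposed normals instead of relying on Two Sided, avoids this.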

I have a similar problem and no solution.

Maybe you could just turn the light intensity down to zero via Blueprint once the sun is below the cubes.
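
A rough sketch of that idea (plain C++ rather than Blueprint; `SunIntensityScale` is a hypothetical helper, and the convention assumed here is that the directional light’s forward vector points from the sun toward the world):

```cpp
#include <algorithm>

// Scale factor for the directional light's intensity, driven by the Z
// component of the light's forward vector. A light shining straight down
// has ForwardZ = -1 (full day); once it points upward (ForwardZ >= 0) the
// sun is below the world, so the intensity clamps to zero (night).
static float SunIntensityScale(float ForwardZ) {
    // Map ForwardZ in [-1, 0] to [1, 0]; clamp at zero for night.
    return std::max(-ForwardZ, 0.0f);
}
```

In a Blueprint this would be the equivalent of feeding the light’s rotation into a Set Intensity node each tick. Note it only hides the symptom: the two-sided material would still light back faces under any other light source.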

Are you also having this lighting issue with point and spot lights?

[=;226358]
Are you also having this lighting issue with point and spot lights?
[/]

Yes, I have generally the problem that my generated meshes bleed light. Say I have a generated wall and attach a light to it, then it looks fine - one side is illuminated by the light and the other side is dark. However, if I put a mesh of some sort on the dark side of the wall then the light affects it as if the wall was not there…

So, what I do is that I never use point lights, only spot lights and Directional lights. It does not really look as good as I would hope, but I also do not have the time nor the skill to find a solution. Maybe it has to do with the generated lightmap or maybe my geometry has holes or whatever - the engine somehow does not like it ^^

I’m having a similar issue. You can see the light bleeding into the cabinet as the camera is pulled back.

https://www.youtube.com/watch?v=MnJWbElI114

Edit: adjusting the shadow bias and shadow filter sharpen for the point light seemed to fix my issue. Now I just need to figure out how to bake out lightmaps for a procedural mesh.

Playing around with the shadow parameters does not work for me (the edges get smaller, but they do not vanish).

I don’t have any edge bleeding with mine, but I am also getting the normal in the shader.

[=mordentral;226948]
I don’t have any edge bleeding with mine, but I am also getting the normal in the shader.
[/]

Do you mean that you are using a normal mapped material? Are you still having lighting issues without getting the normal?

I have the Tangent Space Normal box unchecked for the material and get the normal in the shader. Also it’s been months since I actually generated tangents for the mesh, so I don’t know if I ever had light bleeding or not. However, no mix of double-sided material/single-sided/disconnecting the normal that I try right now can reproduce the issues that you guys are having.

Can you post an image of your material? Not having to calculate tangents would save me a ton of headache.

[Image: derivednormal.jpg]

The output of the cross product goes to the normal input of the material; to use normal maps from textures you have to mask out the B channel and add the RG channels to the cross output. This is all being done on my end because I am triplanar texturing a landscape mesh; you might want to stick with calculated tangents, since you seem to be generating static interior geometry and won’t have as much raw data to pass back and forth to the GPU as I do.
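
The material graph in the screenshot presumably boils down to `normalize(cross(ddy(WorldPos), ddx(WorldPos)))`. A sketch of the same math in plain C++, approximating the hardware ddx/ddy with finite differences between neighboring “pixels” (all names here are illustrative, not engine API):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Sub(const Vec3& a, const Vec3& b) {
    return {a.x - b.x, a.y - b.y, a.z - b.z};
}

static Vec3 Cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y,
            a.z * b.x - a.x * b.z,
            a.x * b.y - a.y * b.x};
}

static Vec3 Normalize(const Vec3& v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}

// Face normal derived from world-position deltas across the surface: the
// CPU analogue of cross(ddy(AbsoluteWorldPosition), ddx(AbsoluteWorldPosition)).
// The sign may need flipping depending on the graphics API's ddy convention.
static Vec3 DerivedNormal(const Vec3& p, const Vec3& pRight, const Vec3& pDown) {
    Vec3 dx = Sub(pRight, p); // ddx: world-position change one pixel to the right
    Vec3 dy = Sub(pDown, p);  // ddy: world-position change one pixel down
    return Normalize(Cross(dy, dx));
}
```

Because the derivatives are constant across each triangle, this always yields flat (faceted) shading: fine for hard-edged voxel geometry, but it cannot reproduce smoothed vertex normals.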

I have a couple of meshes that may benefit from your technique. Is there an extreme performance cost of calculating 2 derivatives every frame vs calculating the tangents once at build time?

It will be slightly slower in the shader of course, but if you are sending lots of data to the GPU over and over, or want to reduce the vertex count (indices can then share verts, as the normal isn’t stored per vert), then it is worth it.
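
To put rough numbers on that trade-off, here is an illustrative comparison. These structs are a sketch, not the engine’s actual FDynamicMeshVertex layout (though that one does pack each tangent into 4 bytes as well):

```cpp
#include <cstdint>

// Illustrative vertex with per-vertex tangent frame (NOT the real layout):
struct VertexWithTangents {
    float Position[3];   // 12 bytes
    uint32_t TangentX;   // 4 bytes, packed normal
    uint32_t TangentZ;   // 4 bytes, packed normal
    float UV[2];         // 8 bytes
};                       // -> 28 bytes per vertex

// Slim vertex when the normal is derived in the shader instead:
struct VertexPositionOnly {
    float Position[3];   // 12 bytes
    float UV[2];         // 8 bytes
};                       // -> 20 bytes per vertex
```

On top of the ~30% smaller vertex, flat-shaded faces no longer need their own copies of shared corner positions, so the vertex count itself drops too; for dynamic meshes re-uploaded every frame that can outweigh the extra shader ALU.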

What are you using as the inputs to ddx and ddy? Since I’m looking at fixing the lighting on my meshes anyway, I noticed that flipping the green channel on my normal maps gives me the correct lighting on my generated meshes. I’m using this algorithm to calculate my tangents.
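
The standard per-triangle tangent derivation (in the style of Lengyel’s commonly cited method; this is a sketch, not necessarily the exact algorithm linked above) solves for the tangent from the triangle’s edge vectors and their UV deltas:

```cpp
struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

// Per-triangle tangent: solve the linear system
//   Edge1 = dU1 * T + dV1 * B
//   Edge2 = dU2 * T + dV2 * B
// for T (the tangent); B (the bitangent) follows the same way.
static Vec3 TriangleTangent(const Vec3& p0, const Vec3& p1, const Vec3& p2,
                            const Vec2& uv0, const Vec2& uv1, const Vec2& uv2) {
    const Vec3 e1{p1.x - p0.x, p1.y - p0.y, p1.z - p0.z};
    const Vec3 e2{p2.x - p0.x, p2.y - p0.y, p2.z - p0.z};
    const float du1 = uv1.u - uv0.u, dv1 = uv1.v - uv0.v;
    const float du2 = uv2.u - uv0.u, dv2 = uv2.v - uv0.v;
    const float r = 1.0f / (du1 * dv2 - du2 * dv1); // assumes non-degenerate UVs
    return {(dv2 * e1.x - dv1 * e2.x) * r,
            (dv2 * e1.y - dv1 * e2.y) * r,
            (dv2 * e1.z - dv1 * e2.z) * r};
}
```

The green-channel flip you noticed usually comes down to the V-axis convention of the normal map (DirectX-style vs OpenGL-style); negating `dv1`/`dv2` here has the same effect as flipping the channel in the texture.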

[=JohnnyBeans78;224072]
I am stumped. I cannot get this working in 4.7. As I have said the collision mesh exists but it does not render. I am going to start fresh and do the tutorial step by step to see if I can get it working then add in the other stuff and see where it breaks (U.V.'s, vertex colors, smooth meshing, …). I am digging the VR features of 4.7 too much to stay in 4.6 any longer. It is time to bite the bullet.
[/]

I’ve had the exact same problem since the update from 4.6.1 to 4.7. I’d be super glad if somebody had a solution…

[=;228742]
What are you using as the inputs to ddx and ddy? Since I’m looking at fixing the lighting on my meshes anyway, I noticed that flipping the green channel on my normal maps gives me the correct lighting on my generated meshes. I’m using this algorithm to calculate my tangents.
[/]

My apologies, I forgot to screenshot that part; it uses the Absolute World Position as the input to both.

Hi! Sorry for my bad English (it’s not my native language).

I edited the code to add smoothing group support for triangles.


USTRUCT(BlueprintType)
struct FProceduralMeshTriangle
{
    GENERATED_USTRUCT_BODY()

    UPROPERTY(EditAnywhere, Category = Triangle)
    FProceduralMeshVertex Vertex0;

    UPROPERTY(EditAnywhere, Category = Triangle)
    FProceduralMeshVertex Vertex1;

    UPROPERTY(EditAnywhere, Category = Triangle)
    FProceduralMeshVertex Vertex2;

    // New: the smoothing group this triangle belongs to.
    UPROPERTY(EditAnywhere, Category = Triangle)
    int32 SmoothingsGroups;
};

and in the scene proxy class:


TArray<FVector> PositionEq;
TArray<int32> SmoothingGroupEq;

// Reuses an existing buffer entry only when a vertex at the same position
// was already added with the same smoothing group; otherwise appends a new
// vertex. Note: Find returns the first match, so a position shared by more
// than two smoothing groups will keep appending duplicates.
auto AddVertexIndexed = [&](const FDynamicMeshVertex& Vert, int32 SG)
{
    const int32 Found = PositionEq.Find(Vert.Position);
    if (Found != INDEX_NONE && SmoothingGroupEq[Found] == SG)
    {
        IndexBuffer.Indices.Add(Found);
        return;
    }
    SmoothingGroupEq.Add(SG);
    PositionEq.Add(Vert.Position);
    const int32 VIndex = VertexBuffer.Vertices.Add(Vert);
    IndexBuffer.Indices.Add(VIndex);
};

// Add each triangle to the vertex/index buffer
for (int32 TriIdx = 0; TriIdx < Component->ProceduralMeshTris.Num(); TriIdx++)
{
    FProceduralMeshTriangle& Tri = Component->ProceduralMeshTris[TriIdx];

    const FVector Edge01 = (Tri.Vertex1.Position - Tri.Vertex0.Position);
    const FVector Edge02 = (Tri.Vertex2.Position - Tri.Vertex0.Position);

    const FVector TangentX = Edge01.SafeNormal();
    const FVector TangentZ = (Edge02 ^ Edge01).SafeNormal();
    const FVector TangentY = (TangentX ^ TangentZ).SafeNormal();

    FDynamicMeshVertex Vert0;
    Vert0.Position = Tri.Vertex0.Position;
    Vert0.Color = Tri.Vertex0.Color;
    Vert0.SetTangents(TangentX, TangentY, TangentZ);
    Vert0.TextureCoordinate.Set(Tri.Vertex0.U, Tri.Vertex0.V);
    AddVertexIndexed(Vert0, Tri.SmoothingsGroups);

    FDynamicMeshVertex Vert1;
    Vert1.Position = Tri.Vertex1.Position;
    Vert1.Color = Tri.Vertex1.Color;
    Vert1.SetTangents(TangentX, TangentY, TangentZ);
    Vert1.TextureCoordinate.Set(Tri.Vertex1.U, Tri.Vertex1.V);
    AddVertexIndexed(Vert1, Tri.SmoothingsGroups);

    FDynamicMeshVertex Vert2;
    Vert2.Position = Tri.Vertex2.Position;
    Vert2.Color = Tri.Vertex2.Color;
    Vert2.SetTangents(TangentX, TangentY, TangentZ);
    Vert2.TextureCoordinate.Set(Tri.Vertex2.U, Tri.Vertex2.V);
    AddVertexIndexed(Vert2, Tri.SmoothingsGroups);
}


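As a side note, the position lookups in the code above are linear scans per vertex, and matching only against the first occurrence of a position means more than two smoothing groups meeting at one corner keep producing duplicates. The same “weld only within a smoothing group” rule can be expressed with a single map keyed on (position, smoothing group). A standalone sketch in plain C++ (std::map instead of TArray/TMap so it compiles outside the engine; all names are illustrative):

```cpp
#include <cstdint>
#include <map>
#include <tuple>
#include <vector>

struct Vtx { float x, y, z; };

// Welds vertices that share both position and smoothing group; vertices at
// the same position but in different smoothing groups stay duplicated so
// they can carry different normals.
struct SmoothingWelder {
    std::vector<Vtx> Vertices;
    std::vector<int32_t> Indices;
    std::map<std::tuple<float, float, float, int32_t>, int32_t> Lookup;

    void AddVertex(const Vtx& v, int32_t SmoothingGroup) {
        const auto Key = std::make_tuple(v.x, v.y, v.z, SmoothingGroup);
        auto It = Lookup.find(Key);
        if (It == Lookup.end()) {
            const int32_t Index = static_cast<int32_t>(Vertices.size());
            Vertices.push_back(v);
            Lookup.emplace(Key, Index);
            Indices.push_back(Index);
        } else {
            Indices.push_back(It->second);
        }
    }
};
```

Adding the same corner twice with group 1 produces one vertex; adding it again with group 2 produces a second. In engine code a TMap with a composite key would play the same role.
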
And other news: I used the procedural mesh in a BP and made a procedural box, but with tessellation. You can build a box with x, y, z dimensions and t polygons on each side of the box.
https://drive.google.com/file/d/0ByXahhyru0l-VURLWWUxaW9mZWs/view?usp=sharing

But this thing has some problems.

  1. This BP is based on the construction script, and in the viewport it works fine. But when I press the play button, my mesh disappears.
  2. The mesh contains vertex colors, but vertex paint does not work. I found the FPaintedVertex struct in StaticMeshComponent.h and am thinking about including that code in the procedural mesh. But I’m a beginner…
  3. The next step I want to take is to add a material ID to FProceduralMeshTriangle.

One week ago I made a BP which builds a procedural house from static mesh clusters. This BP has input parameters: dimensions, number of floors, and the number of windows and doors for each wall. I can use vertex paint and different materials, and it works fine. BUT: when I build a new house, for example 5-9 floors and 10x15 meters, the BP contains 1000 static meshes and FPS drops from 120 to 30. When I add a whole building with the same polygon count exported from 3ds Max, the FPS is fine (120).

That’s why I need the procedural mesh: to generate one mesh for one building or one floor.

[=TaurusI76;228819]
I’ve had the exact same problem since the update from 4.6.1 to 4.7. I’d be super glad if somebody had a solution…
[/]

Guys make sure that you actually implement


virtual void GetDynamicMeshElements(const TArray<const FSceneView*>& Views, const FSceneViewFamily& ViewFamily, uint32 VisibilityMap, FMeshElementCollector& Collector) const override


and



virtual void DrawStaticElements(FStaticPrimitiveDrawInterface* PDI)


The DrawDynamicElements() function has been on its way out for a couple of releases, and this one finally enforces it: it no longer gets called at all (that is why your mesh isn’t rendering).

This is actually the very first major update where I didn’t have to change a single line of code to compile my project :stuck_out_tongue:

[=mordentral;228303]

[Image: derivednormal.jpg]

The output of the cross product goes to the normal input of the material; to use normal maps from textures you have to mask out the B channel and add the RG channels to the cross output. This is all being done on my end because I am triplanar texturing a landscape mesh; you might want to stick with calculated tangents, since you seem to be generating static interior geometry and won’t have as much raw data to pass back and forth to the GPU as I do.
[/]

Thank you for posting this. Unfortunately it does not fix my lighting problems. Currently I have no idea what else to try, and it looks really awkward when UE4 switches shading modes at larger distances and I get a brightly lit landscape in the dead of night ;).