Creating a room floor in Augmented Reality using procedural mesh

Hi all,

I am trying to create a room floor using the Procedural Mesh component. I want this in an AR app, where the user touches the screen to lay down vertex co-ordinates and then presses a button to generate a mesh based on those vertices. I can get this working when the user places three vertices, but after that I'm lost on how to get UE4 to use more than three vertices: first to create a square, for instance, and then a more complicated room surface.

In this first part I am getting the touch co-ordinates from the finger and using Apple ARKit to get real-world co-ordinates.

In the second image I am storing those co-ordinates in an array that then drives the triangle array and the UV array, neither of which works. I am also spawning an object at each location to show that a position has been stored.

Lastly, through a simple button on a widget, I am calling a custom event that creates the mesh based on the vertex arrays I have stored.

I have looked at a few things that go through the proper indices you should use when creating a square and so on, but nothing that would let me scale this up to a more complicated surface.

Any help would be appreciated.
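(For anyone landing on this thread later: the standard way to extend the three-vertex case to N user-placed vertices is a fan triangulation. Below is a minimal sketch in plain C++ rather than Blueprint; the function name `FanTriangulate` is made up for illustration. It shows how the triangle index array would be built for the Triangles input of ProceduralMeshComponent's CreateMeshSection, assuming the vertices were placed in order around the floor boundary.)

```cpp
#include <cstdint>
#include <vector>

// Sketch: generate triangle indices for a fan triangulation of NumVertices
// boundary points placed in order. The resulting flat index array, together
// with the vertex array, is what CreateMeshSection's Triangles pin expects.
std::vector<int32_t> FanTriangulate(int32_t NumVertices)
{
    std::vector<int32_t> Indices;
    if (NumVertices < 3)
        return Indices; // need at least one full triangle

    for (int32_t i = 2; i < NumVertices; ++i)
    {
        // Every triangle shares vertex 0; the winding order (0, i-1, i)
        // determines which side the face normal points to.
        Indices.push_back(0);
        Indices.push_back(i - 1);
        Indices.push_back(i);
    }
    return Indices;
}
```

For four placed vertices (a square) this yields {0,1,2, 0,2,3}; for five, three triangles, and so on. This works for any convex boundary; a concave room outline would need a proper triangulation such as ear clipping instead.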

Hi, did you manage to solve the problem? I'm trying to do the same thing using the handheld AR template, but can't figure it out.

up !!! Creating a room floor in Augmented Reality using procedural mesh !!

Up Up Up !!

@jerobarraco

Up Up Up !

But why? This has been explored fully now:

https://www.youtube.com/results?search_query=ue4+procedural+mesh

I've tried very hard to find an example of this and couldn't, and as of today it's still something I need.

So I went and tried my best at an implementation. There's a lot of room for optimization.
Also, I couldn't get it to work with procedural meshes: the exact same code produces no mesh, so I'm using the Custom Mesh component instead.

I've created a material based on the one in "GoogleARCoreContent/ARCoreOcclusionMaterial", just so I can adjust the brightness; this seems to be a problem with mobile HDR.
On BeginPlay I create a dynamic material instance and store a reference to it.

On Tick I call a node called “UpdateARCoreOcclusionMaterialInstance” and pass it the reference to that dynamic material, then:

  1. Call ClearCustomMeshTriangles
  2. Update the triangles
  3. Call SetCustomMeshTriangles again

To update the triangles:
Call GetAllPlanes, then iterate over the planes.
For each plane I call GetBoundaryPolygonInLocalSpace and store the result.
Then, for each polygon, I iterate from index = 2 to index = length - 1
and call “MakeCustomMeshTriangle”, adding each triangle to a list to pass to “SetCustomMeshTriangles” later.
The points of each triangle come from the polygon like this:

Vertex0 = Plane.GetLocalToWorldTransform().TransformLocation(Polygon[0])

Vertex1 = Plane.GetLocalToWorldTransform().TransformLocation(Polygon[i-1])

Vertex2 = Plane.GetLocalToWorldTransform().TransformLocation(Polygon[i])

This is basically a fan triangulation.
I know this is not optimized in many ways, but as a proof of concept it works.
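The loop above can be sketched in plain C++ so the indexing is explicit. `Vec3`, `Triangle`, and `TransformLocation` here are stand-ins for the UE4 types and for Plane.GetLocalToWorldTransform().TransformLocation(); the stand-in transform is a pure translation, just for illustration.

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { float X, Y, Z; };
struct Triangle { Vec3 V0, V1, V2; };

// Stand-in for Plane.GetLocalToWorldTransform().TransformLocation(P):
// here simply a translation by the plane's origin.
Vec3 TransformLocation(const Vec3& PlaneOrigin, const Vec3& P)
{
    return { PlaneOrigin.X + P.X, PlaneOrigin.Y + P.Y, PlaneOrigin.Z + P.Z };
}

// Fan-triangulate one plane's boundary polygon into world-space triangles,
// mirroring the i = 2 .. length-1 loop described above: every triangle
// shares Polygon[0], and each step adds (Polygon[i-1], Polygon[i]).
std::vector<Triangle> TriangulateBoundary(const Vec3& PlaneOrigin,
                                          const std::vector<Vec3>& Polygon)
{
    std::vector<Triangle> Out;
    for (std::size_t i = 2; i < Polygon.size(); ++i)
    {
        Out.push_back({
            TransformLocation(PlaneOrigin, Polygon[0]),
            TransformLocation(PlaneOrigin, Polygon[i - 1]),
            TransformLocation(PlaneOrigin, Polygon[i]),
        });
    }
    return Out;
}
```

An N-point boundary polygon produces N-2 triangles, so a four-point plane boundary yields two triangles, exactly as in the Blueprint version.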

It seems these nodes are recommended over the passthrough camera nodes.

Note that the material and the update node are not present on the master branch (4.26-ish), but there are other nodes; I haven't tried them, but GetARTexture and “UpdateCameraTextureParam” could be the ones.

I got a lot from these:
Dynamic shadows · Issue #49 · google-ar/arcore-unreal-sdk · GitHub
ARCore UE4.19 Passthrough Camera Material Brighter than BG Camera - XR Development - Unreal Engine Forums

This is basically what I've done, though it took me some time to (find and) glue the pieces: AR Mesh Occlusion Solutions - #8 by CosmicLobster - Mobile Development - Unreal Engine Forums