Hi, all!
A lot of the tutorials I’ve seen have involved sketching out some terrain, and then projecting some meshes onto it and copying the impact point’s alignment.
Can a similar effect be achieved without requiring the terrain? For example, if I generate some points using spatial noise, can I use those to calculate the would-be normals at each point?
Right now I’m able to calculate some locations and spawn in some meshes, but I’m unsure how to get them to align correctly.
Fig 1: Point altitude determined by spatial noise and some multiplier
Fig 2: Debug visualization of points
Fig 3: The meshes have been spawned in. How to align them?
Ok. So I found a fairly cumbersome solution, but it works. Is there a faster way?
- Create a points grid.
  - In my case, I created a grid with a cell size of 100.
- Create multiple transformations of the points in the grid, one for each direction of adjacency.
  - In my case, I translated 100 units to the north, south, east, and west.
  - I suspect that with a hex grid you would instead have a transformation for 12, 2, 4, 6, 8, and 10 o’clock.
- Apply spatial noise to each of the transformed sets of points. Make sure you use the same seed in each case.
- Use the $position value from each of the transformed point sets as a neighbor position on the original set of points.
  - In my case, my points have attributes for NorthNeighborPosition, EastNeighborPosition, etc.
- Use vector subtraction to get the location delta to each neighbor point.
  - In my case, my points have attributes for NorthDelta, EastDelta, etc.
- Going clockwise, take the cross product of consecutive delta vectors to get the normal of the polygon they would form.
  - In my case, NorthDelta x EastDelta yields NorthEastPolyNormal, EastDelta x SouthDelta yields SouthEastPolyNormal, etc.
- Average the normals of the adjacent polys to get a normal vector for the point.
- Use the point’s normal vector to create its new rotation (a sketch of the whole pipeline follows this list).
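In case it helps anyone reading later, here’s a minimal Python/numpy sketch of the same steps outside the graph. The `height()` function, the cell size, and the axis convention (X = north, Y = east, Z = up) are stand-in assumptions for whatever noise and grid you’re actually using:

```python
import numpy as np

def height(x, y):
    # Hypothetical stand-in for the spatial noise that sets point altitude.
    # Swap in your real noise function; keep the seed fixed so the shifted
    # "neighbor" grids sample the same surface as the original grid.
    return 100.0 * np.sin(x * 0.01) * np.cos(y * 0.01)

def point_normal(x, y, cell=100.0):
    """Average the normals of the four polys a point forms with its
    north/east/south/west neighbors (X = north, Y = east, Z = up)."""
    p = np.array([x, y, height(x, y)])

    # Neighbor positions: the original point translated one cell in each
    # direction, with the same noise applied to get its altitude.
    neighbors = {
        "north": np.array([x + cell, y, height(x + cell, y)]),
        "east":  np.array([x, y + cell, height(x, y + cell)]),
        "south": np.array([x - cell, y, height(x - cell, y)]),
        "west":  np.array([x, y - cell, height(x, y - cell)]),
    }

    # Location delta to each neighbor (NorthDelta, EastDelta, ...).
    deltas = {name: pos - p for name, pos in neighbors.items()}

    # Going clockwise (N->E, E->S, S->W, W->N), cross consecutive deltas to
    # get each adjacent poly's normal, then average and normalize.
    order = ["north", "east", "south", "west"]
    poly_normals = [np.cross(deltas[a], deltas[b])
                    for a, b in zip(order, order[1:] + order[:1])]
    n = np.mean(poly_normals, axis=0)
    return n / np.linalg.norm(n)

def rotation_from_normal(normal):
    """Build a 3x3 rotation matrix whose local Z axis is the normal.
    The yaw around the normal is arbitrary here."""
    z = normal / np.linalg.norm(normal)
    ref = np.array([0.0, 0.0, 1.0])
    if abs(np.dot(z, ref)) > 0.99:        # normal nearly vertical: use X instead
        ref = np.array([1.0, 0.0, 0.0])
    x = np.cross(ref, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    return np.column_stack([x, y, z])     # columns are the local X/Y/Z axes

if __name__ == "__main__":
    n = point_normal(250.0, 400.0)
    print("normal:", n)
    print("rotation:\n", rotation_from_normal(n))
```

The yaw around the normal in `rotation_from_normal` is one arbitrary choice among many; in practice you’d take it from the point’s existing rotation or a seed. Sampling the noise at the offset positions is exactly what the translated point sets do in the node setup above.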