Mesh from depth cameras (spatial mapping)

Hey all! UE can already do AR and the tracking behind it, with other AR platforms integrated such as Apple, HoloLens, and Google, as well as the LiDAR plugin.

Is there anything out there, or has anyone looked into generating a mesh from depth images coming from a live stream, video, or other sources?
My scenario: I already know the position of the camera, and it moves around my UE scene driven by other data sources and sensors (not image-based tracking like AR). As the camera moves, I'd like to build up a mesh over time as it walks through the area, similar to the iPhone room-scanning apps that generate a mesh as you walk around.
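To show what I mean by "I already know the pose", the core per-pixel step I'm picturing is just back-projecting each depth sample with the camera intrinsics and that known pose. A rough sketch (the Fx/Fy/Cx/Cy intrinsics, the depth units, and the axis flip are assumptions about my sensor, not anything UE-specific):

```cpp
#include "CoreMinimal.h"

// Back-project one depth pixel (U, V) with depth in centimetres into a world-space point,
// given pinhole intrinsics and the camera-to-world transform I get from my other sensors.
FVector DepthPixelToWorld(int32 U, int32 V, float DepthCm,
                          float Fx, float Fy, float Cx, float Cy,
                          const FTransform& CameraToWorld)
{
    // Pinhole model: ray through the pixel, scaled by the measured depth.
    const float RightCm = (U - Cx) / Fx * DepthCm;
    const float DownCm  = (V - Cy) / Fy * DepthCm;

    // Map into UE's convention (X forward, Y right, Z up); the sign flip on Z is because
    // image V grows downward. This may need adjusting for a different camera convention.
    const FVector PointInCamera(DepthCm, RightCm, -DownCm);
    return CameraToWorld.TransformPosition(PointInCamera);
}
```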

But because I'm only getting depth images and not using a manufacturer's SDK or product line, I was hoping there's a generic implementation that takes depth and turns it into a 3D mesh using UE's procedural mesh system. That way I could build the scene during gameplay, not just in editor mode.
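For clarity, here's the kind of "generic" pass I mean, using UProceduralMeshComponent (from the Procedural Mesh Component plugin) and the DepthPixelToWorld helper sketched above. It's the naive approach of triangulating each depth frame as a grid into its own mesh section at runtime, not a fused TSDF-style reconstruction; hole/discontinuity filtering and normals are left out:

```cpp
#include "CoreMinimal.h"
#include "ProceduralMeshComponent.h"

// Build one mesh section from a single row-major depth image, using the known camera pose.
void BuildMeshSectionFromDepth(UProceduralMeshComponent* ProcMesh,
                               const TArray<float>& DepthCm,
                               int32 Width, int32 Height,
                               float Fx, float Fy, float Cx, float Cy,
                               const FTransform& CameraToWorld,
                               int32 SectionIndex)
{
    TArray<FVector> Vertices;
    TArray<int32> Triangles;
    Vertices.Reserve(Width * Height);

    // One vertex per depth pixel, back-projected into world space.
    for (int32 V = 0; V < Height; ++V)
    {
        for (int32 U = 0; U < Width; ++U)
        {
            Vertices.Add(DepthPixelToWorld(U, V, DepthCm[V * Width + U],
                                           Fx, Fy, Cx, Cy, CameraToWorld));
        }
    }

    // Two triangles per 2x2 block of neighbouring pixels. In practice I'd skip
    // zero/invalid depth and large depth discontinuities here; winding may need
    // flipping depending on the camera convention.
    for (int32 V = 0; V < Height - 1; ++V)
    {
        for (int32 U = 0; U < Width - 1; ++U)
        {
            const int32 I0 = V * Width + U;
            const int32 I1 = I0 + 1;
            const int32 I2 = I0 + Width;
            const int32 I3 = I2 + 1;
            Triangles.Add(I0); Triangles.Add(I2); Triangles.Add(I1);
            Triangles.Add(I1); Triangles.Add(I2); Triangles.Add(I3);
        }
    }

    // Normals/UVs/colors/tangents left empty for brevity; this works during gameplay,
    // not just in the editor.
    ProcMesh->CreateMeshSection_LinearColor(SectionIndex, Vertices, Triangles,
                                            TArray<FVector>(), TArray<FVector2D>(),
                                            TArray<FLinearColor>(), TArray<FProcMeshTangent>(),
                                            /*bCreateCollision=*/ false);
}
```

Calling this per incoming frame with an incrementing SectionIndex would pile up sections as the camera walks around, which obviously won't scale; that's exactly why I'm asking whether a proper generic depth-fusion/meshing implementation already exists for UE.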
Any pointers would be greatly appreciated!