Depth-map point clouds from a texture

Hello there! First question here. Does anyone know how to pass a sampled texture's (render target) RGB values as horizontal, vertical and depth positions in a Niagara system?

I have been using TouchDesigner to make point clouds with a Kinect V2:

Mapping the R, G and B values onto instanced geometry gives that cool point cloud. I have been trying to bring that technique into Unreal so I could have 3D characters with real-time montage animation blending (which is hard to do in TouchDesigner) and mix those with live visuals of a real environment.
I have been looking at tutorials and I get the idea that I could pass the texture to Niagara with a render target. I have done that with TouchEngine, which lets me display a TouchDesigner file that streams the Kinect feed as you see in the image above (the box called “null1” shows the Kinect in RGB). My issue is I haven't really made it work with the tutorials so far, and I'm scratching my head at the many modules I could make in the Niagara system haha, so I don't really know how I should, or could, pass the RGB values as UVZ for the particles.
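To make it concrete, this is roughly what I think the module should do, written as Niagara Custom HLSL (just a sketch pieced together from the tutorials, not working code; GridWidth, GridHeight and PositionScale are input names I made up, and I'm not sure SampleTexture2D is the exact function the texture data interface exposes):

```
// Custom HLSL in a particle spawn script; spawn GridWidth * GridHeight particles.
// Assumed node inputs (all placeholder names):
//   RenderTarget          - texture data interface bound to the TouchEngine render target
//   GridWidth, GridHeight - int, resolution of the particle grid
//   PositionScale         - float3, world-space scale per axis
// Assumed output:
//   OutPosition           - float3, written to Particles.Position afterwards

// One particle per pixel: derive this particle's UV from its execution index.
int Index = ExecIndex();
int X = Index % GridWidth;
int Y = Index / GridWidth;
float2 UV = (float2(X, Y) + 0.5f) / float2(GridWidth, GridHeight);

// Sample the render target; R/G/B carry horizontal, vertical and depth.
float4 Sample = float4(0, 0, 0, 0);
RenderTarget.SampleTexture2D(UV, Sample);

// Map RGB straight to XYZ, like the TouchDesigner setup does.
OutPosition = float3(Sample.r, Sample.g, Sample.b) * PositionScale;
```

The idea would be to spawn one particle per pixel and have something like this set each particle's position from the texture, but I don't know if that's how you're supposed to wire it up.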

These are the tutorials I have watched so far that kinda give the result I need:

These have been the results using only TouchDesigner so far (if anyone cares :laughing: ). You can see me in the point cloud with 3D characters, but changing or blending animations is quite a hassle inside TouchDesigner, so if anyone can help me move to Unreal I will appreciate it!

Thanks!

Hello @hernancrocce

You can see how this is implemented in the Nuitrack plugin (it works with the Kinect and other sensors).
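In case it helps while you dig through it: the core math any of these sensor plugins use to turn a depth pixel into a 3D point is the standard pinhole back-projection. This is the general formula, not Nuitrack's actual code, and the focal length and principal point are the camera intrinsics you get from the sensor SDK:

```
// Back-project a depth pixel (u, v) with depth Z into a camera-space point.
// FocalLength (fx, fy) and PrincipalPoint (cx, cy) are the depth camera's
// intrinsics; for a Kinect V2 they come from the SDK and vary per unit.
float3 DepthToCameraSpace(float2 Pixel, float Depth, float2 FocalLength, float2 PrincipalPoint)
{
    float X = (Pixel.x - PrincipalPoint.x) * Depth / FocalLength.x;
    float Y = (Pixel.y - PrincipalPoint.y) * Depth / FocalLength.y;
    return float3(X, Y, Depth);
}
```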