Hello there! First question here. Does anyone know how to pass the RGB values of a sampled texture (a render target) as horizontal, vertical and depth positions in a Niagara system?
I have been using TouchDesigner for making pointclouds with a Kinect V2:
Specifying the R, G and B values instances geometry and gives that cool pointcloud. I've been trying to move that technique to Unreal so I could have 3D characters with real-time montage animation blending (which is hard to do in TouchDesigner) and mix those with live visuals of a real environment.
I have been looking at tutorials and I get the idea that I could pass the texture to Niagara as a render target. I have done that with TouchEngine, which lets me display a TouchDesigner file that streams the Kinect feed as you see in the image above (the box called “null1” shows the Kinect in RGB). My issue is that I haven't really made it work with the tutorials so far, and I'm scratching my head with the many modules I could make in the Niagara system haha, so I don't really know how I should, or could, pass the RGB values as XYZ (horizontal, vertical, depth) for the particles.
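To make the mapping I mean concrete, here is a rough, untested CPU-side sketch of the idea: read the render target's pixels back on the game thread and push each pixel's R, G, B as an X, Y, Z position into a user-exposed Position Array parameter on the Niagara component. The parameter name "Positions", the `Scale` argument and the function itself are just placeholders I made up; the GPU route (sampling the render target with a Texture Sample data interface inside a Scratch Pad module) would skip the readback entirely and seems to be what the tutorials do.

```cpp
// Sketch only: maps each pixel's RGB (0..255) to an XYZ position and hands the
// result to Niagara through an exposed Vector Array user parameter.
#include "Engine/TextureRenderTarget2D.h"
#include "NiagaraComponent.h"
#include "NiagaraDataInterfaceArrayFunctionLibrary.h"

void PushRenderTargetToNiagara(UTextureRenderTarget2D* RenderTarget,
                               UNiagaraComponent* Niagara,
                               float Scale /* world units per normalized channel value */)
{
    if (!RenderTarget || !Niagara)
    {
        return;
    }

    // Read the render target back to the CPU as 8-bit colors.
    FTextureRenderTargetResource* Resource = RenderTarget->GameThread_GetRenderTargetResource();
    TArray<FColor> Pixels;
    if (!Resource || !Resource->ReadPixels(Pixels))
    {
        return;
    }

    // Interpret each pixel's R, G, B as a normalized X, Y, Z position.
    TArray<FVector> Positions;
    Positions.Reserve(Pixels.Num());
    for (const FColor& Pixel : Pixels)
    {
        Positions.Add(FVector(Pixel.R, Pixel.G, Pixel.B) / 255.0f * Scale);
    }

    // "Positions" is an assumed name for a user-exposed Position/Vector Array
    // parameter on the Niagara system.
    UNiagaraDataInterfaceArrayFunctionLibrary::SetNiagaraArrayVector(Niagara, FName("Positions"), Positions);
}
```

An 8-bit FColor readback quantizes the depth pretty hard, so a float render target read with something like ReadLinearColorPixels would probably be better, but the R→X, G→Y, B→Z mapping is the same either way; on the Niagara side the array would then get read back and written into Particles.Position.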
These are the tutorials I've watched so far that kinda give the result I need:
These are the results using only TouchDesigner so far (if anyone cares): you can see me in the pointcloud with 3D characters, but changing or blending animations is quite a hassle inside TouchDesigner, so if anyone can help me move to Unreal I will appreciate it!
Thanks!