Community Tutorial: Interactive Niagara Particles with Kinect-like sensors [Nuitrack]

Depth sensors capture rich depth data, so why not use it to create great visual effects?
This tutorial shows how to build a real-time interactive point cloud.

Samples inside: Easy Interactive Particles (real-time 3D motion plugin) [Nuitrack] | Community tutorial

Trailer: https://youtu.be/w7Mee0fhpg8?feature=shared
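The core step behind a sensor-driven point cloud is back-projecting each depth pixel into 3D space with the pinhole camera model. Here is a minimal sketch in plain Python, independent of Nuitrack or Unreal; the frame size and intrinsic values (`fx`, `fy`, `cx`, `cy`) are illustrative, not taken from any real sensor:

```python
# Hypothetical sketch: back-project a depth frame (millimeters) into a 3D
# point cloud using pinhole camera intrinsics. Depth sensors expose frames
# like this; the intrinsics below are illustrative only.

def depth_to_point_cloud(depth, width, height, fx, fy, cx, cy):
    """depth: flat row-major list of per-pixel depth values in mm.
    Returns a list of (x, y, z) points in meters, skipping invalid pixels."""
    points = []
    for v in range(height):
        for u in range(width):
            z_mm = depth[v * width + u]
            if z_mm == 0:          # 0 means "no reading" on most depth sensors
                continue
            z = z_mm / 1000.0      # mm -> m
            x = (u - cx) * z / fx  # pinhole model: X = (u - cx) * Z / fx
            y = (v - cy) * z / fy  # pinhole model: Y = (v - cy) * Z / fy
            points.append((x, y, z))
    return points

# Tiny 2x2 frame: one invalid pixel, three valid ones.
cloud = depth_to_point_cloud([0, 1000, 1000, 2000],
                             width=2, height=2,
                             fx=500.0, fy=500.0, cx=1.0, cy=1.0)
```

In Unreal, a per-frame array like this could then drive a Niagara emitter (for example via a Niagara array data interface) to place one particle per point each frame.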

This looks really cool, thanks for sharing!
I’ve been curious about ways to bring depth-sensor data into Unreal for more interactive visuals, and using it with Niagara particles opens up a ton of creative possibilities. The point cloud approach in real time is especially interesting — feels like it could be used for both artistic installations and gameplay mechanics.

Appreciate the sample link and trailer, I’ll definitely check those out. Quick question: does this setup work with other depth sensors besides Nuitrack-supported ones, or is it fairly locked into that ecosystem?


Hello @paedyn90

Thanks for your interest. It only works with sensors that Nuitrack officially supports. Note that this is not a fixed list; it is updated regularly: https://nuitrack.notion.site/87e45f2fb76c4456973f826dc1583ebc?v=332b96533bae44688c01cbc0b88baf7f

If you need support for a specific sensor, you can send a request to support-nuitrack@3divi.com.