Neo Kinect - easy access to the Kinect v2 capabilities in your games

With the Neo Kinect plugin (available in Code Plugins on the UE Marketplace) you can use the Kinect v2 sensor's advanced capabilities within Unreal Engine, through easy-to-use, fully commented Blueprint nodes or directly through the C++ methods. Take a look at the Quick Start Guide to see how it works and get an idea of what you can achieve with it!

Robust and fast

The plugin was created with performance and usability in mind, so you can track all 6 possible users and their faces, and enable all of the Kinect's frame types (color, depth, infrared etc.) at the same time with almost no performance hit. Sensor polling runs in its own thread, and a custom texture type was created just for the high-resolution real-time updates, while remaining compatible with the material editor system. If you need to, there are even functions to access the textures' pixel values.

No need for components

The sensor is unique, no matter how many Actors or Widgets are using it. So, instead of adding components or extending specific Blueprints, you just call functions, as with a function library. That way you can control the device from any Blueprint, including Widgets.
Advanced Remapping

Besides access to the standard Microsoft Kinect API coordinate-remapping methods, the plugin also comes with other remapping features that facilitate AR applications, like getting the location of a joint in the Color frame without losing its depth information. Every location and orientation is adapted to Unreal's coordinate system, and joint transforms are compatible with the Engine's Mannequin character rig.
Fully production proven

I used Neo Kinect a lot (for more than a year) before releasing it to the public and fixed every bug found along the way, besides making many performance improvements. It has been used in applications that run a whole day without crashing, and it packages without problems.
Technical Details

Body tracking:

  • Tracking of up to 6 simultaneous users' skeletons, with 25 joints each
  • Users' leaning angle, tracking confidence, body edge clipping and hand states
  • Per Body found/lost events

Face tracking:

  • Location and orientation of up to 6 simultaneous users' faces
  • Face points (left and right eyes, nose and left and right mouth corners) in 3D and 2D (Color and Infrared space)
  • Faces bounding boxes in Color and Infrared frames space
  • Expression reading (Engaged, Happy, Looking Away, Mouth Moved, Mouth Open and Left and Right Eye Open/Closed) and whether users are wearing glasses
  • Per Face found/lost events

Sensor control:

  • Global bodies/faces tracking events (found/lost)
  • Init/Uninit sensor
  • Get sensor tilt, ground plane normal and sensor height

Remapping:

  • 3D camera location to Color texture (optionally with depth) and to Depth texture
  • Find depth of a Color texture location
  • Depth point to Color point and to 3D location
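
The "Depth point to 3D location" step boils down to a pinhole back-projection with the depth camera's intrinsics. Below is a minimal, plugin-independent sketch under assumed, approximate Kinect v2 depth-camera intrinsics; the names and numbers are illustrative only, as the plugin itself relies on the sensor's calibrated coordinate mapper:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical approximate intrinsics for the Kinect v2 depth camera
// (512x424, focal length ~365 px). Illustrative only; the plugin uses
// the sensor's own calibrated coordinate mapper.
struct Intrinsics { float fx, fy, cx, cy; };
struct Point3 { float x, y, z; };

// Back-project a Depth-frame pixel (u, v) with its depth in millimeters
// into a 3D camera-space point in meters (pinhole model).
Point3 DepthPixelTo3D(const Intrinsics& k, float u, float v, float depthMm)
{
    const float z = depthMm * 0.001f; // mm -> m
    return { (u - k.cx) / k.fx * z,
             (v - k.cy) / k.fy * z,
             z };
}
```

A pixel at the principal point with 1000 mm of depth maps to (0, 0, 1) m in camera space; the plugin additionally converts such points into Unreal's coordinate system.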

Frames (textures):

  • Get each frame's FOV and dimensions
  • Toggle frames usage individually
  • Sample a pixel value from the Depth frame and find the depth of a Color pixel

Network Replicated: No
Platform: Win64 only

Example Project:
Quick Start guide: NeoKinect-QuickStart.pdf


Hello. I want to get a texture containing only the body pixels: alpha = 1 where there's a body, 0 otherwise.

I can’t believe the whole thread is gone!!
@VictorLerp any idea what happened here? Several messages, all images… everything’s gone but the OP.

Hi, RVillani! I've backed up a couple of pages of your thorough answers, so I can send them to you if you'd like. BTW, will you update the plugin to be compatible with UE 5?

That’s great! Yes, please and thank you :smiley:

I will, yeah. I intend to make it compatible with DX12 as well.

That’s good news, thank you!
I've PM'd you the original pages - let me know if you didn't receive them.

Hi RVillani, I've been using your plugin in a lot of projects with great results, but I was wondering if you were planning to integrate support for the new Azure Kinect or to make another plugin dedicated to it :slight_smile: There is currently a plugin, but it hasn't been updated since 4.25, and I'd love to work on the new hardware with a stable, supported plugin.

Hi, Linton!
I’m happy to know you like the plugin :smiley:
For now, I have no plans of making a plugin for Kinect Azure, though. I couldn’t get one in Brazil and now that I’m living in Canada, I would have very little time to work on it.

Hi Rodrigo,

Thanks for your plugin, it looks amazing and robust.

I am about to buy it for a project using Kinect v2 for an AR experience where users will take control of an invisible skeleton with Niagara particle systems attached to its joints (mostly the arms, since we want users to turn into some kind of feather person). The particles will be feathers.

We would also eventually like to bind wing models to their arms. The overall idea is to turn them into some kind of bird.

The plugin looks like it could do exactly what we need, but I have stumbled onto this comment of yours in the answers on the plugin's marketplace page:

What do you mean? Because it sounds to me that what we need is mocap features and that your plugin provides exactly that :slight_smile:

Thank you!

Sorry for the delay. I formatted my PC and then my phone recently and still haven't caught up with setting everything up as before.

If the user will always be facing the screen, it should work just fine for your purposes (and I’d like to see that, it sounds awesome!).

What I mean when I say Kinect is not good for mocap is that it’s bad with sideways and back-facing capture. So full mocap, where you can do anything and the software will record correctly, is not feasible with it.

Understood :slight_smile: thanks for your answer

Yes, our users will face the screen, so that should do it. Perfect. I'll let you know when the project is completed and show you how it looks.