With the Neo Kinect plugin you can use the Kinect v2 sensor's advanced capabilities within Unreal Engine, either through easy-to-use, fully commented Blueprint nodes or directly through the C++ methods. Take a look at the Quick Start Guide to see how it works and get an idea of what you can achieve with it!
Robust and fast
The plugin was created with performance and usability in mind, so you can track all 6 possible users, track their faces, and enable all of the Kinect's frame types (color, depth, infrared, etc.) at the same time with almost no performance hit. Sensor polling runs in its own thread, and a custom texture type was created just for the high-resolution real-time updates while staying compatible with the Material Editor. If you need them, there are even functions to access the textures' pixel values.
No need for components
There is only one sensor, no matter how many Actors or Widgets are using it. So, instead of adding components or extending specific Blueprints, you just call functions, as with a function library. That way you can control the device from any Blueprint, including Widgets.
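A minimal sketch of that call style in C++, assuming a hypothetical UNeoKinectFunctionLibrary with static Init/Uninit calls (check the plugin's actual headers and Blueprint nodes for the real names):

```cpp
// Sketch only: UNeoKinectFunctionLibrary and its functions are placeholder
// names standing in for the plugin's actual static library calls.
#include "MyKinectActor.h"

void AMyKinectActor::BeginPlay()
{
    Super::BeginPlay();
    // No component to add: the sensor is global, so any Actor, Widget,
    // or C++ class can drive it through plain function calls.
    UNeoKinectFunctionLibrary::InitSensor(); // hypothetical name
}

void AMyKinectActor::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
    UNeoKinectFunctionLibrary::UninitSensor(); // hypothetical name
    Super::EndPlay(EndPlayReason);
}
```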
Advanced Remapping
Besides access to the standard Microsoft Kinect API coordinate-remapping methods, the plugin also comes with other remapping features that facilitate AR applications, like getting the location of a joint in the Color frame without losing its depth information. Every location and orientation is adapted to Unreal's coordinate system, and joint transforms are compatible with the Engine's Mannequin character rig.
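For instance, an AR overlay pinned to a tracked joint could be derived roughly like this (a sketch under assumed names; MapCameraPointToColorPoint is hypothetical, not the documented API):

```cpp
// Sketch of the remapping idea above: project a joint's 3D camera-space
// location into Color-frame pixels while keeping its depth. The mapping
// function name is hypothetical.
FVector2D ProjectJointToColorFrame(const FVector& JointLocation, float& OutDepth)
{
    // Returns the joint's pixel position in the Color frame and writes the
    // depth at that point, so an overlay can be scaled with distance, e.g.
    // OverlayScale = ReferenceDepth / OutDepth;
    return UNeoKinectFunctionLibrary::MapCameraPointToColorPoint(JointLocation, OutDepth); // hypothetical
}
```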
Fully production-proven
I used Neo Kinect heavily for more than a year before releasing it to the public, fixing every bug found so far and making many performance improvements along the way. It has been used in applications that run a whole day without crashing, and it packages without problems.
Technical Details
Body tracking
- Tracking of up to 6 simultaneous users' skeletons, with 25 joints each (see the sketch after this list)
- Users' leaning angle, tracking confidence, body edge clipping, and hand states
- Per Body found/lost events
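As a sketch of how that data might be consumed (the function names are placeholders for the plugin's actual body-tracking accessors):

```cpp
// Sketch only: IsBodyTracked() and GetJointTransform() are placeholder names.
// Up to 6 bodies can be tracked, each with 25 joints.
for (int32 BodyIndex = 0; BodyIndex < 6; ++BodyIndex)
{
    if (!UNeoKinectFunctionLibrary::IsBodyTracked(BodyIndex)) // hypothetical
    {
        continue; // no tracked user in this body slot right now
    }

    for (int32 JointIndex = 0; JointIndex < 25; ++JointIndex)
    {
        // Joint transforms arrive already converted to Unreal's coordinate
        // system and compatible with the Mannequin rig.
        const FTransform JointTransform =
            UNeoKinectFunctionLibrary::GetJointTransform(BodyIndex, JointIndex); // hypothetical
        // ... drive an avatar bone, trigger effects, etc.
    }
}
```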
Face tracking
- Location and orientation of up to 6 simultaneous users' faces
- Face points (left and right eyes, nose and left and right mouth corners) in 3D and 2D (Color and Infrared space)
- Face bounding boxes in Color and Infrared frame space
- Expression reading (Engaged, Happy, Looking Away, Mouth Moved, Mouth Open, and Left and Right Eye Open/Closed) and whether users are wearing glasses
- Per Face found/lost events
Sensor control
- Global bodies/faces tracking events (found/lost)
- Init/Uninit sensor
- Get sensor tilt, ground plane normal and sensor height
Remapping
- 3D camera location to Color texture (optionally with depth) and to Depth texture
- Find depth of a Color texture location
- Depth point to Color point and to 3D location
Frames (textures)
- Get each frame's FOV and dimensions
- Toggle each frame type's usage individually
- Sample a pixel value from the Depth frame and find the depth of a Color pixel
Network Replicated: No
Platform: Win64 only
Quick Start guide: NeoKinect-QuickStart.pdf
Example Project for Unreal Engine 5: NeoKinectExamples.zip.
- This example does not yet use the new UE5 demo room, only the new skeleton (the UE4 skeleton is still there as well).
Example Project for Unreal Engine 4: NeoKinectExamples_UE4.zip.
- Updated AvateeringDemo Blueprint with joint smoothing for UE 4.26+ (replace the asset in the example project).
FAQ
Q: The Avateering AnimBP doesn’t work for my skeleton. Orientations look wrong. How can I fix it? Can I use retargeting?
A: You’ll need to compute “retargeting” manually. Instructions here.
Q: How do I reduce jittering on my tracking?
A: For UE4, use the updated example AvateeringBP found here. It uses linear interpolation to blend the joints' transforms from one frame to the next, smoothing out big random changes in movement. The UE5 demo already includes that functionality.
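In code terms, the smoothing boils down to blending each joint's previous transform toward the newly tracked one every frame, roughly like this (a generic sketch, not the plugin's exact implementation):

```cpp
#include "Math/Transform.h"
#include "Math/UnrealMathUtility.h"

// Blend last frame's joint transform toward the freshly tracked one.
// Alpha near 0 = heavy smoothing (more lag); Alpha near 1 = raw tracking (more jitter).
FTransform SmoothJoint(const FTransform& Previous, const FTransform& Tracked, float Alpha)
{
    const FVector Location = FMath::Lerp(Previous.GetLocation(), Tracked.GetLocation(), Alpha);
    const FQuat Rotation = FQuat::Slerp(Previous.GetRotation(), Tracked.GetRotation(), Alpha);
    return FTransform(Rotation, Location, Tracked.GetScale3D());
}
```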
Q: My packaged game crashes on launch. How do I fix it?
A: Check whether you copied the runtime face DLLs from the Kinect SDK redist folder into your packaged game. Specific instructions are in the Quick Start PDF that comes with the plugin.
Q: Will Neo Kinect support Azure?
A: No. If I ever create an Azure Kinect plugin (there are no plans for one currently), it will be a separate plugin, since its API is different from the Kinect v2's.
Q: Kinect is no longer in production. For how long will Neo Kinect be supported?
A: I’ll keep it up to date with the engine for as long as there are people buying it, which means there’s still interest in using Kinect v2 with Unreal.
Q: Can I use Neo Kinect for mocap?
A: I wouldn't recommend it for mocap. Not because of the plugin, but because of Kinect itself.
- Kinect always assumes you're facing it, even when your back is turned, so knees and elbows bend very oddly. And sideways, it gets really confused about the limbs it can't see.
- Then there's the recording part. The plugin is made for runtime use. It should be possible to use Unreal's Take Recorder to capture skeletal movement that happens in-game, so you could use the Neo Kinect Avateering example for that, but I still don't think you'd like the final animation quality, because Kinect is not that precise (no matter what MS tells you).
- I’ve worked a lot with Kinect and it’s very nice for controlling things with gestures, but for mocap or AR applications, it’s not even satisfactory IMHO.
Q: How do the Avateering BPs work?
A: Read this post.
Q: Any advice to create a virtual dresser?
A: Yes. Read this post.
Q: Can I use the plugin with Kinect Studio without a sensor?
A: No, but you can use pre-recorded data with one. Further details and advice here.