Neo Kinect - easy access to the Kinect v2 capabilities in your games

With the Neo Kinect plugin you can use the Kinect v2 sensor's advanced capabilities within Unreal Engine, through easy-to-use, nicely commented Blueprint nodes, or directly through the C++ methods. Take a look at the Quick Start Guide to see how it works and get an idea of what you can achieve with it!

Robust and fast

The plugin was created with performance and usability in mind, so you can track all 6 possible users, track their faces and enable all of the Kinect's frame types (color, depth, infrared etc.) at the same time with almost no performance hit. Sensor polling runs in its own thread, and a custom texture type was created just for the high-resolution real-time updates, while remaining compatible with the Material Editor. There are even functions to access the textures' pixel values, if you need them.
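
For the curious, here is a minimal sketch of what sensor polling on a dedicated thread can look like in Unreal, using the engine's FRunnable. This is illustrative only, not the plugin's actual implementation, and FetchLatestKinectFrames is a hypothetical placeholder:

```cpp
// Minimal sketch of a dedicated polling thread using Unreal's FRunnable.
#include "CoreMinimal.h"
#include "HAL/Runnable.h"
#include "HAL/RunnableThread.h"
#include "HAL/ThreadSafeBool.h"

class FSensorPollRunnable : public FRunnable
{
public:
	FSensorPollRunnable()
	{
		Thread = FRunnableThread::Create(this, TEXT("KinectPollThread"));
	}

	virtual ~FSensorPollRunnable() override
	{
		bStopRequested = true;
		if (Thread)
		{
			Thread->WaitForCompletion();
			delete Thread;
		}
	}

	virtual uint32 Run() override
	{
		while (!bStopRequested)
		{
			// Poll the sensor and copy new frame data here, off the game thread.
			// FetchLatestKinectFrames(); // hypothetical placeholder
			FPlatformProcess::Sleep(0.001f); // yield; Kinect v2 delivers frames at ~30 Hz
		}
		return 0;
	}

	virtual void Stop() override { bStopRequested = true; }

private:
	FRunnableThread* Thread = nullptr;
	FThreadSafeBool bStopRequested = false;
};
```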

No need for components

The sensor is unique, no matter how many Actors or Widgets are using it. So, instead of adding components or extending specific Blueprints, you just call functions, as with a function library. That way you can control the device from any Blueprint, including Widgets.
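
For reference, this is roughly what the function-library pattern looks like in Unreal C++. The class and function names below are illustrative, not Neo Kinect's actual API:

```cpp
// Sketch of the Blueprint function-library pattern (names are hypothetical).
#include "CoreMinimal.h"
#include "Kismet/BlueprintFunctionLibrary.h"
#include "KinectSketchLibrary.generated.h"

UCLASS()
class UKinectSketchLibrary : public UBlueprintFunctionLibrary
{
	GENERATED_BODY()

public:
	// Static functions become Blueprint nodes callable from anywhere:
	// Actors, Widgets, Animation Blueprints - no component needed.
	UFUNCTION(BlueprintCallable, Category = "Kinect")
	static bool InitSensor()
	{
		// A real implementation would open the device here.
		return true;
	}
};
```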

Advanced Remapping

Besides access to the standard Microsoft Kinect API coordinate remapping methods, the plugin also comes with other remapping features that facilitate AR applications, like getting the location of a joint in the Color frame without losing its depth information. Every location and orientation is adapted to Unreal's coordinate system, and joint transforms are compatible with the Engine's Mannequin character rig.
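
As an illustration, converting a raw Kinect camera-space point to Unreal's coordinate system amounts to something like the sketch below. The plugin already outputs converted values; the exact axis mapping shown here is my assumption, not necessarily the plugin's:

```cpp
// Sketch: Kinect camera space (meters, right-handed, +Y up, +Z out of the
// sensor, +X to the sensor's left) to Unreal (centimeters, left-handed,
// +X forward, +Y right, +Z up). Axis signs are an assumption.
#include "CoreMinimal.h"

FVector KinectCameraPointToUnreal(float X, float Y, float Z)
{
	return FVector(
		Z * 100.f,   // Kinect depth axis      -> Unreal forward
		-X * 100.f,  // Kinect +X (sensor left) -> Unreal +Y (right), flipped
		Y * 100.f);  // Kinect up              -> Unreal up
}
```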

Fully production proven

I used Neo Kinect extensively (for more than a year) before releasing it to the public, fixing every bug found so far and making many performance improvements along the way. It has been used in applications that run all day long without crashing, and it packages without problems.

Technical Details

Body tracking

  • Tracking of up to 6 simultaneous users' skeletons, with 25 joints each
  • Users' leaning angle, tracking confidence, body edge clipping and hand states
  • Per Body found/lost events

Face tracking

  • Location and orientation of up to 6 simultaneous users' faces
  • Face points (left and right eyes, nose and left and right mouth corners) in 3D and 2D (Color and Infrared space)
  • Faces bounding boxes in Color and Infrared frames space
  • Expression readings (Engaged, Happy, Looking Away, Mouth Moved, Mouth Open, and Left and Right Eyes Open/Closed) and whether users are wearing glasses or not
  • Per Face found/lost events

Sensor control

  • Global bodies/faces tracking events (found/lost)
  • Init/Uninit sensor
  • Get sensor tilt, ground plane normal and sensor height

Remapping

  • 3D camera location to Color texture (optionally with depth) and to Depth texture
  • Find depth of a Color texture location
  • Depth point to Color point and to 3D location
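
As an illustration of the first item above, here is how the equivalent projection looks with the raw Kinect for Windows SDK 2.0 types that the plugin wraps (the plugin exposes this through Blueprint nodes instead):

```cpp
// Sketch: project a 3D camera-space joint into Color-frame pixel coordinates
// using the Kinect SDK's coordinate mapper. The joint's Z (depth, in meters)
// remains available alongside the resulting 2D point.
#include <Kinect.h>

ColorSpacePoint ProjectToColorFrame(ICoordinateMapper* Mapper, const CameraSpacePoint& Joint)
{
	ColorSpacePoint ColorPoint = {};
	Mapper->MapCameraPointToColorSpace(Joint, &ColorPoint);
	return ColorPoint;
}
```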

Frames (textures)

  • Get each frame FOV and dimensions
  • Toggle frames usage individually
  • Sample a pixel value from the Depth frame and find the depth of a Color pixel
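
For reference, sampling a depth pixel boils down to indexing the raw frame buffer, as in the sketch below. The plugin ships this as a ready-made function; this only shows the underlying layout (Kinect v2's depth frame is 512x424, one 16-bit value per pixel, in millimeters):

```cpp
// Sketch: sampling one pixel from a raw Kinect v2 depth buffer.
#include "CoreMinimal.h"

uint16 SampleDepthMm(const uint16* DepthBuffer, int32 X, int32 Y)
{
	constexpr int32 DepthWidth = 512;       // Kinect v2 depth frame width
	return DepthBuffer[Y * DepthWidth + X]; // row-major pixel indexing
}
```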

Network Replicated: No
Platform: Win64 only

Quick Start guide: NeoKinect-QuickStart.pdf

Example Project for Unreal Engine 5: NeoKinectExamples.zip.

  • This example does not yet use the new UE5 demo room, only the new skeleton (the UE4 one is still there as well).

Example Project for Unreal Engine 4: NeoKinectExamples_UE4.zip.

FAQ

Q: The Avateering AnimBP doesn’t work for my skeleton. Orientations look wrong. How can I fix it? Can I use retargeting?
A: You’ll need to compute “retargeting” manually. Instructions here.

Q: How do I reduce jittering on my tracking?
A: For UE4, use the updated example AvateeringBP found here. It uses linear interpolation to blend the joints' transforms from one frame to the next, smoothing out big random changes in movement. The UE5 demo is already updated with that functionality.
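
For those implementing their own smoothing in C++, the idea boils down to something like this sketch (illustrative; the actual example does it in Blueprint):

```cpp
// Sketch: damp jitter by blending each new joint transform toward the
// previous frame's smoothed value. Alpha is in 0..1; lower values are
// smoother but make the tracking lag more.
#include "CoreMinimal.h"

FTransform SmoothJointTransform(const FTransform& Previous, const FTransform& Latest, float Alpha)
{
	FTransform Smoothed;
	Smoothed.SetLocation(FMath::Lerp(Previous.GetLocation(), Latest.GetLocation(), Alpha));
	Smoothed.SetRotation(FQuat::Slerp(Previous.GetRotation(), Latest.GetRotation(), Alpha));
	Smoothed.SetScale3D(Latest.GetScale3D());
	return Smoothed;
}
```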

Q: My packaged game crashes on launch. How do I fix it?
A: Check whether you copied the runtime face DLLs from the Kinect SDK redist folder into your packaged game. Specific instructions are in the Quick Start PDF that comes with the plugin.

Q: Will Neo Kinect support Azure?
A: No. If I ever create an Azure plugin (no plans for it currently), it would be its own plugin, since the API is different from the Kinect v2's.

Q: Kinect is no longer in production. For how long will Neo Kinect be supported?
A: I'll keep it up to date with the engine for as long as people keep buying it, which means there's still interest in using Kinect v2 with Unreal.

Q: Can I use Neo Kinect for mocap?
A: I wouldn't recommend it for mocap. Not because of the plugin, but because of Kinect itself.

  • Kinect always thinks you're facing it, even when your back is turned to it, so knees and elbows bend very strangely. And when you turn sideways, it gets really confused about the limbs it can't see.
  • Then there's the recording part. The plugin is made for runtime use. What I assume would be possible is to use the Take Recorder in Unreal to record skeletal movements that happen in game, so you could use the Neo Kinect avateering example for that. But I still don't think you'd like the final animation quality, because Kinect is not that precise (no matter what MS tells you).
  • I've worked a lot with Kinect and it's very nice for controlling things with gestures, but for mocap or AR applications it's not even satisfactory, IMHO.

Q: How do the Avateering BPs work?
A: Read this post.

Q: Any advice to create a virtual dresser?
A: Yes. Read this post.

Q: Can I use the plugin with Kinect Studio without a sensor?
A: No, but you can use pre-recorded data when a sensor is connected. Further details and advice here.

Hello. I want to get a texture where there are only body pixels: alpha = 1 where there's a body, 0 otherwise.

I can’t believe the whole thread is gone!!
@VictorLerp any idea what happened here? Several messages, all images… everything’s gone but the OP.

Hi, RVillani! I've backed up a couple of pages of your thorough answers, so I can send them to you if you'd like. BTW, will you update the plugin to be compatible with UE 5?

That’s great! Yes, please and thank you :smiley:

I will, yeah. I intend to make it compatible with DX12 as well.

That’s good news, thank you!
I've PM'd you the original pages; drop me a note if you didn't receive them.

Hi RVillani, I've been using your plugin in a lot of projects with great results, but I was wondering if you were planning on integrating support for the new Azure Kinect or making another plugin dedicated to it :slight_smile: Currently there is a plugin, but it hasn't been updated since 4.25, and I'd love to work with the new hardware using a stable and supported plugin.
Thanks!

Hi, Linton!
I’m happy to know you like the plugin :smiley:
For now, I have no plans of making a plugin for Kinect Azure, though. I couldn't get one in Brazil, and now that I'm living in Canada I would have very little time to work on it.
Cheers

Hi Rodrigo,

Thanks for your plugin, it looks amazing and robust.

I am about to buy it for a project using Kinect v2 for an AR experience where users will take control of an invisible skeleton with Niagara particle systems attached to its joints (mostly the arms, since we want the users to turn into some kind of feather person). The particles will be feathers.

We would also eventually like to bind wing models to their arms. The overall idea is to turn them into some kind of bird.

The plugin looks like it could do exactly what we need, but I stumbled onto this comment of yours in the answers on the plugin's marketplace page:

What do you mean? Because it sounds to me that what we need is mocap features and that your plugin provides exactly that :slight_smile:

Thank you!

Hi!
Sorry for the delay. I formatted my PC and then my phone recently and still haven't caught up with setting everything up as before.

If the user will always be facing the screen, it should work just fine for your purposes (and I’d like to see that, it sounds awesome!).

What I mean when I say Kinect is not good for mocap is that it's bad at capturing sideways and back-facing poses. So full mocap, where you can do anything and the software will record it correctly, is not feasible with it.

Understood :slight_smile: thanks for your answer

Yes, our users will face the screen, so that should do it. Perfect. I will let you know when the project is completed and show you how it looks.

Hello,
I'm seeing bad rotations on the Mannequin's shoulders, and a bit on the elbows. To be clear, I already asked this question before, and the author was kind enough to give me an approach to fix rotations using "Make Rot from ZX", under the assumption that my character has different bone rotations than those of the UE4 Mannequin. But the problem is that I see the same issue in the (UE4) example project with the Mannequin itself (attached image).

It happens with any other character and with any user testing the game.

I would like to know if I'm the only one getting this issue. Maybe my Kinect is giving these weird rotations for the shoulders?

Could you please explain how to set up another skeletal mesh? I tried, but I can't get it to work, and I guess I don't understand the workflow. The skeletal mesh ends up twisted and deformed every time. I tried DAZ characters and MetaHumans. Thank you!

Hi, in AR color mapping, can I map more than one person?

Hello, a question: the little white mannequin model in your example moves the opposite way from what I need. How do I adjust this? (When I raise my left hand, the model also raises its left hand; I need the model to raise its right hand instead, to match the effect of a virtual fitting mirror.)

The plugin seems very complete, but how do we use it to make a virtual dresser? Do we have to place a plane with a material showing the color video, with the avatar in front of it, or do we have to use a widget, or something else?

How do we get the result you showed in the presentation images?

Hello, how can I get a texture with only the body pixels: alpha = 1 where there's a body, 0 otherwise?

Hi, check the first item in the FAQ section of the first post. I added that section recently, with explanations for the most commonly asked questions, and yours is the very first.

Yes. The plugin works with up to 6 people at once.

That's right. You need to flip your mesh's Y-axis scale (-1) or use a post-process material to mirror the whole screen.
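
In C++, the first option is a one-liner, assuming a hypothetical USkeletalMeshComponent pointer on your avatar Actor:

```cpp
// Sketch: mirror the avatar by flipping its Y scale. "MeshComp" is a
// hypothetical component reference, not part of the plugin's API.
#include "Components/SkeletalMeshComponent.h"

void MirrorAvatar(USkeletalMeshComponent* MeshComp)
{
	MeshComp->SetRelativeScale3D(FVector(1.f, -1.f, 1.f)); // flip the Y axis
}
```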