Neo Kinect - easy access to the Kinect v2 capabilities in your games

This plugin can be used on Kinect v1: [TUTORIAL][PLUGIN] Real-time body tracking for UE5 [Nuitrack]


Someone asked on the Marketplace how the Avateering demo Blueprints work, and I believe the answer might be useful to others, so I’ll leave it here as well, where it’s easier to find and link to.

How the Avateering demo Blueprints work

I’ll call BP_AvateeringDemo and BP_AvateeringDemo_UE4 the Actor BPs, and AvateeringDemo_AnimBP and AvateeringDemoUE5_AnimBP the Anim BPs.

The Actor BPs communicate with the Anim BPs through functions from the BPI_AvateeringAnimBP Interface Blueprint. Both Anim BPs implement that interface to give meaning to its functions SetJointsTransforms and GetJointsTransforms. The Actor BPs get the AnimInstance from their SkeletalMeshComponents (that’s the running Anim BP for a skeleton) and use those interface functions to send Transforms to it and to retrieve previously sent Transforms. Because of that, the Actor BPs don’t need to know which specific Anim BP they’re talking to: you can switch to other skeletons with different Anim BPs that implement the interface and the Actor BPs will still work with them, even if they’re 6 different skeletons, as long as each uses an Anim BP that implements BPI_AvateeringAnimBP.
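If it helps to see the pattern outside of Blueprint, here’s a minimal C++ sketch of an equivalent interface (the names mirror the Blueprint assets, but this C++ version is illustrative only, not part of the plugin):

```cpp
#include "UObject/Interface.h"
#include "AvateeringAnimInterface.generated.h"

UINTERFACE(Blueprintable)
class UAvateeringAnimInterface : public UInterface
{
    GENERATED_BODY()
};

// Any Anim Blueprint implementing this interface can be driven by the Actor BP,
// which only ever talks to the interface, never to a concrete Anim BP class.
class IAvateeringAnimInterface
{
    GENERATED_BODY()

public:
    // Receives the joint Transforms computed by the Actor BP this frame.
    UFUNCTION(BlueprintCallable, BlueprintImplementableEvent)
    void SetJointsTransforms(const TArray<FTransform>& Transforms);

    // Returns the Transforms that were sent on the previous frame.
    UFUNCTION(BlueprintCallable, BlueprintImplementableEvent)
    void GetJointsTransforms(TArray<FTransform>& OutTransforms);
};
```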

The Actor BPs are the only ones that talk to the Kinect sensor, reading Joints Transforms data from it. Every frame, they retrieve the previous frame’s Transforms from the Anim BPs using GetJointsTransforms, interpolate them with the current frame’s data from Kinect, and send the interpolated Transforms back to the Anim BPs using SetJointsTransforms. That way we can do noise filtering (smoothing) on Kinect’s joints data without storing an extra Transforms array for each mesh just to read the previous frame’s data. We simply reuse the array already in each Anim BP.
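A rough C++ equivalent of that per-frame filtering could look like this (a sketch with assumed variable names; the demo does this in Blueprint, and SmoothingSpeed is an arbitrary parameter):

```cpp
// Previous holds the Transforms sent to the Anim BP last frame (read back with
// GetJointsTransforms); Current holds this frame's joint Transforms from Kinect.
const float Alpha = FMath::Clamp(DeltaTime * SmoothingSpeed, 0.f, 1.f);
for (int32 i = 0; i < Current.Num(); ++i)
{
    FTransform& Smoothed = Previous[i];
    Smoothed.SetLocation(FMath::Lerp(Smoothed.GetLocation(), Current[i].GetLocation(), Alpha));
    Smoothed.SetRotation(FQuat::Slerp(Smoothed.GetRotation(), Current[i].GetRotation(), Alpha));
}
// Previous now holds the filtered Transforms; send it back with SetJointsTransforms.
```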

The Anim BPs are the only ones that manipulate the Skeletal Mesh bones. I could’ve done it from the Actor BPs, using nodes like SetSocketTransform on the SkeletalMeshComponents, but that’s much worse performance-wise. Besides, using the Anim BPs to manipulate the bones enables the use of other Anim BP functionality, like jiggle physics, post-process Anim BPs, Anim Layers and so on. But, to reiterate, the Anim BPs just use the Transforms they’re told to. The Actor BPs are the ones talking to Kinect, interpreting its Joints Transforms and sending them to the Anim BPs.

Advice on creating virtual dressers

Prepare yourself: even with the features I added to the plugin just for that, it’s complex.

It’s very important that you use a skeleton with the same rig as the UE Mannequin: not just the same bone names, but also the same bone orientations. Otherwise, setup is harder (see the first questions in this Q&A section).

You’ll need to activate color-space joints Transforms right after you init the sensor on BeginPlay, so the joint coordinates are aligned with the color camera. That’s done by calling the node Set Use Joints Color Space Transforms with Use checked. Once you’ve done that, you can use the Color Location and Color Orientation properties when you break a KinectJoint structure.

Scale the bones in the SkeletalMesh to match the user’s bones, as some people are taller or shorter than others. I did it by storing the original SkeletalMesh bones’ lengths on BeginPlay, before any change was made to the mesh. To get a bone’s length from the Skeletal Mesh, use Get Socket Location on the two joints that form that bone (like upperarm_l and lowerarm_l for the left upper arm) and save the distance between them. I suggest saving those values in an array ordered the same as the Kinect joints. You can get an index from any Kinect joint value using Joint to Index, to keep the ordering consistent when passing the values to the AnimBP.


Notice that, for the array index, I used the same joint that’s used for “upperarm_l” in the AnimBP.
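In C++ terms, measuring one bone would look roughly like this (a sketch; Get Socket Location is the same call as the Blueprint node, while OriginalBoneLengths and LeftElbowJointIndex are hypothetical names):

```cpp
// Rest length of the left upper arm: the distance between the sockets of the
// two joints that form it. Stored at the index of the matching Kinect joint
// (obtained with Joint to Index in the demo).
const FVector UpperArm = Mesh->GetSocketLocation(TEXT("upperarm_l"));
const FVector LowerArm = Mesh->GetSocketLocation(TEXT("lowerarm_l"));
OriginalBoneLengths[LeftElbowJointIndex] = FVector::Dist(UpperArm, LowerArm);
```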

Then, each frame, scale the bones by sending the correct scales to the SkeletalMesh AnimBP. The plugin gives you the user’s bone lengths via Get Bone Length, which you can call on a tracked NeoKinectBody. What worked best for me was to scale only the axis along the bone (X for the Mannequin), so its thickness doesn’t change; that way it’s almost unnoticeable that you’re scaling it. The math is Scale = UserBoneLength / OriginalMeshBoneLength.


Scaling logic added to BP_AvateeringDemo, from the demo project.

I advise you to lerp the scale value over time (Lerp(PreviousFrameValue, CurrentFrameValue, DeltaTime * Speed)), because Kinect will jitter it constantly.
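Putting the two formulas together, a per-bone update might look like this in C++ (illustrative names only; the demo does this in Blueprint):

```cpp
// Scale = UserBoneLength / OriginalMeshBoneLength, applied only along the bone
// axis (X for the Mannequin) and lerped over time to hide Kinect's jitter.
const float TargetScale = UserBoneLength / OriginalMeshBoneLength;
CurrentScale = FMath::Lerp(CurrentScale, TargetScale, DeltaTime * SmoothingSpeed);
BoneScales[BoneIndex] = FVector(CurrentScale, 1.f, 1.f); // passed to the AnimBP
```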

Finally, you’ll need to mirror the scene horizontally. The easiest way is to scale the Actor to -1 on the Y axis, but back when I did that, it would break cloth simulation. So what I did instead was to create a post process that inverts the U axis of the rendered scene, by passing it through a OneMinus node in the material.
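For reference, the scale-based mirror is a one-liner on the Actor (standard engine API); just keep the cloth caveat above in mind:

```cpp
// Mirror the whole Actor horizontally by negating its Y scale.
// (This is the approach that broke cloth simulation for me back then.)
SetActorScale3D(FVector(1.f, -1.f, 1.f));
```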

Using pre-recorded data via Kinect Studio

While the plugin can be used with Kinect Studio, Kinect Studio itself cannot send pre-recorded data to other Kinect apps without a sensor connected to the system.

  1. Make sure there are no other apps accessing Kinect together with Unreal (like Kinect demos).
  2. Even using Kinect Studio, it only works if the Kinect sensor is connected and working. It’s annoying, but unfortunately that’s how it works. This is not a plugin limitation; it’s a Kinect Studio one. At least, mine doesn’t work without the sensor connected to the system.
  3. Make sure Kinect Studio is connected to the Kinect service. When it is, it displays real-time data from the sensor when not playing a pre-recorded file.
  4. Open your project and play. You should see the data from Kinect Studio there. You can go back and forth between real-time data and pre-recorded data by playing and stopping the playback in Kinect Studio.
    • I also noticed that, for real-time data, Kinect framerate is terrible if the Kinect Studio window is not focused.

Modifying the Anim BP to work with different skeletons

It’s like manual retargeting

Retargeting won’t work for Neo Kinect, because retargeting is only for animation assets. Neo Kinect gets data from Kinect and converts it to Unreal Engine coordinate space. Then, to make things easier for avateering, I compute the transforms that move the Mannequin skeleton correctly. For that to work, the bones’ orientations are very important.

If your skeleton has the same bone names but the orientations are different, you’ll have to customize the Avateering AnimBP for your bones’ orientations. The Mannequin’s upperarm_r bone has X pointing opposite to the bone direction, Z pointing towards the skeleton’s body and Y towards its back. To see those orientations, select the bone in the SK_Mannequin asset, press W to activate the translate tool and select the local transform mode in the top-right corner of the viewport (the first icon after the move/rotate/scale tools). If upperarm_r on your custom skeleton doesn’t match that exactly, the orientation set on the Avateering AnimBP will look wrong.

But you can still compute correct orientations yourself.

Let’s say your mesh’s upperarm_r has X pointing forward, Z along the bone direction and Y towards the character’s ribs. This means that [X, Y, Z] on your mesh’s upperarm_r map to [-Y, Z, -X] on the Mannequin. Therefore, to create the correct rotation, use the Make Rot from ZX node and pass it the correct axes from the Kinect rotation for that joint. The node exists in all combinations of axes; I chose ZX here because Z is the axis along the bone. Always pick that one first. The other axis doesn’t really matter, as long as you pick the correct one from Kinect to convert. In this case, the Z axis on your mesh maps to -X on the Mannequin, so you get the forward vector from the Kinect Right Elbow joint rotation and multiply it by -1 to invert its direction. X maps to -Y on the Mannequin, so you get the rotation’s right axis and multiply it by -1. Feeding those two vectors to Make Rot from ZX gives you the correct rotation for your mesh’s upperarm_r.
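Here’s the same remapping as a C++ sketch, using UKismetMathLibrary::MakeRotFromZX (the C++ twin of the Blueprint node). KinectElbowRot stands for the Right Elbow rotation already converted by the plugin; the variable names are illustrative:

```cpp
#include "Kismet/KismetMathLibrary.h"

// Mesh Z (the axis along the bone) maps to Mannequin -X, so invert Kinect's
// forward vector. Mesh X maps to Mannequin -Y, so invert Kinect's right vector.
const FVector KinectForward = UKismetMathLibrary::GetForwardVector(KinectElbowRot);
const FVector KinectRight = UKismetMathLibrary::GetRightVector(KinectElbowRot);
const FRotator UpperArmR = UKismetMathLibrary::MakeRotFromZX(-KinectForward, -KinectRight);
```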

It will require manual work to get those correct for all bones, but I can’t have the plugin support every possible different orientation of bones. So I followed the Mannequin as a standard.

Here’s a tutorial I made about the Make Rot from [XYZ] nodes. I hope it helps.

Is there a way to change the frame dimensions coming from the Kinect? Currently it’s 1920x1080, but I need it in portrait, 1080x1920. Is that doable?

Hey @RVillani
Is it possible to get data from multiple Kinect devices using this plug-in?

Perhaps with some sort of modification?
I want to have 3 sensors placed in separate positions, track the closest human to each, and relay the info from all 3 to the same Unreal instance.

I am also wondering if Kinect can adjust tracking based on its orientation or whether it has to always be set in a certain orientation?

No. Kinect doesn’t provide that option.
The ugly solution I used once was to display the Kinect camera filling the screen, with the sides cut off. It’s very sad, because you have to blow up the 1080p video feed to 1920 high. But Kinect really doesn’t have a vertical option. And if you turn the sensor, I don’t remember why exactly, but it messes with something. You could try, maybe I’m wrong. But I think it has to do with the floor and angle detection, which I used to auto-align things in 3D.

Not with the plugin as it is. When I wrote it, I didn’t realize that was a possibility, so everything works via a static singleton instance of the class that controls the sensor’s functionalities. That’s even reflected in the BP functions that control it, which are all global, not requiring any specific sensor instance.

You may try to adapt it, but there will be two main issues:

  • The Unreal part of the code, whose source you get with the plugin, controls the sensor via a static singleton, as mentioned. That means you’d have to change a lot of code to make it work with multiple instances of that type.
  • There’s another part of the code that, for bureaucratic reasons, I had to keep in a dll, with no source provided. It’s not Unreal code and it’s pre-compiled, so you can package your project just fine, but if changes to that part were necessary, that’d be a stopper for the idea of multiple sensors. I’m sorry about that, but I had to do it like that because when I wrote the plugin I was a partner at a company. When I left, they allowed me to sell it as long as I hid that part of the code. :frowning:
    • Looking at it, I do see an issue. The function that initializes the sensor calls GetDefaultKinectSensor (see the sketch below), so there’s no way to tell it which specific sensor to get from Unreal. I’d have to change that, then change a bunch of the UE plugin code to not use global functions for everything. Or somehow make it optional to either use the default sensor for everything or have ways to control a specific instance the same way.
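For reference, that entry point comes from the Kinect for Windows SDK 2.0 and only ever returns the default sensor; this is standard SDK usage, not plugin code:

```cpp
#include <Kinect.h> // Kinect for Windows SDK 2.0

// The public SDK exposes no way to pick a device: GetDefaultKinectSensor is
// the only sensor-acquisition function, which is what blocks multi-sensor
// support at this level.
IKinectSensor* Sensor = nullptr;
if (SUCCEEDED(GetDefaultKinectSensor(&Sensor)) && Sensor)
{
    Sensor->Open();
    // ... use the sensor, then Sensor->Close() and Sensor->Release() when done.
}
```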

I have Windows 11, and the Kinect SDK installation keeps failing. The solutions from the Windows forums are worthless and don’t work.

How can I make it work?

Other than the registry fix described here, which worked for me, it could be incompatibility with your USB chipset. I had to buy an additional USB card once because Kinect wouldn’t work with the USB ports I had on my motherboard.

I don’t know what else it could be. Check if you can get it working on other machines too, to eliminate the possibility of a defective sensor, I guess.

Villani, my issue is not the Kinect runtime. I checked with the Kinect Configuration Verifier and everything passed. My issue is the SDK v2 installation. I need its DLLs to place in the project folder so it’s detected and works in UE, according to your documentation. Sadly, I don’t have the “Lower Filters” entry mentioned in the blog :smiling_face_with_tear:

Here is the log snippet from the error I get when installing the SDK. OS: Win 11. I also tried Win 10; the issue was the same.

[6764:2C44][2024-04-26T15:01:29]i399: Apply complete, result: 0x80004005, restart: None, ba requested restart: No
[6764:2C44][2024-04-26T15:01:33]i500: Shutting down, exit code: 0x0
[6764:2C44][2024-04-26T15:01:33]i410: Variable: VCRTx64Installed = 1
[6764:2C44][2024-04-26T15:01:33]i410: Variable: VCRTx86Installed = 1
[6764:2C44][2024-04-26T15:01:33]i410: Variable: VersionNT64 = 6.2.0.0
[6764:2C44][2024-04-26T15:01:33]i410: Variable: WixBundleAction = 4

Hey,

I’m getting a crash in 5.3.2

Happens when I open the NewKinectExamples project

I’ve added kinect.face.h, kinect.h and Kinect.INPC.H to the Unreal engine install, and Kinect20.face.dll and NuiDatabase to the NewKinectExamples project.

Unhandled Exception: 0xc06d007e

KERNELBASE
UnrealEditor_NeoKinectUnreal!__delayLoadHelper2() [D:\a_work\1\s\src\vctools\delayimp\delayhlp.cpp:312]
UnrealEditor_NeoKinectUnreal!_tailMerge_kinect20_face_dll()
UnrealEditor_NeoKinectUnreal!NeoKinect::KinectSensor::SetUseFrame()
UnrealEditor_NeoKinectUnreal!UNeoKinectManager::GetFaces() [D:\build\U5M-Marketplace\Sync\LocalBuilds\PluginTemp\HostProject\Plugins\NeoKinectUnreal\Source\NeoKinectUnreal\Private\NeoKinectManager.cpp:932]
UnrealEditor_NeoKinectUnreal!UNeoKinectManager::execGetFaces() [d:\build\U5M-Marketplace\Sync\LocalBuilds\PluginTemp\HostProject\Plugins\NeoKinectUnreal\Intermediate\Build\Win64\UnrealEditor\Inc\NeoKinectUnreal\UHT\NeoKinectManager.gen.cpp:204]
UnrealEditor_CoreUObject (×9)
UnrealEditor_Engine (×7)
UnrealEditor_UnrealEd (×4)
UnrealEditor_Engine (×2)
UnrealEditor_UnrealEd (×2)
UnrealEditor (×6)
kernel32
ntdll

Did you have any success with the Kinect SDK installation? I’ve been facing this issue for the last few days and don’t know how to solve it. The Windows forums are no help.

Anyway, for your issue, here is some help from the forums:
https://answers.microsoft.com/en-us/windows/forum/all/the-exception-unknown-software-exception/96d61106-2cf2-450c-b61d-9dc2ce72f695

What’s your Windows OS version and build number?

Yeah, I have the SDK installed. The Kinect works fine in Touch Designer. Win 11, v10.0.22631

What if you add the “Lower Filters” entry yourself?

Nick, it crashes when trying to activate the face features, indicating the dll is not where it’s expected. Are you sure you copied the dll into the Binaries\Win64 folder inside your project folder?

Unfortunately, due to Kinect being very picky about USB controllers, I can’t yet verify that the callstack looks like that when the dlls are missing. The USB card I used for Kinect stopped working, so I’m waiting for a new one from Amazon and hoping it works with Kinect v2. If I can’t figure that out, I won’t be able to continue keeping the plugin up to date. :frowning:

Hey,

Here’s a screen grab of where the kinect20.Face.dll lives. Unreal made a 5.3 copy of the project when I opened it; could that cause any issues?

I just need body tracking for my project. Can I disable the face features if this can’t be resolved?

I’m not entirely sure you can disable it, but I think you can try to not access the functionality.

I believe it’s crashing when you try to PIE on the Examples level, right? If so, remove the BP_FaceTracking Actor from the level. That’s the one using the face features.

At any rate, that looks like the correct file in the correct folder. Can you debug-run the project from a coding IDE to see if there’s any info about the dll on the callstack? Maybe the address where it’s trying to load it from. Perhaps your instance of the project is not looking for it in Binaries\Win64 for some reason.

Do you use some shortcut to launch the project or are you double-clicking the .uproject file? Shortcuts can change the startup folder of an application, which is where the application looks for dlls and other things automatically. I’m shooting in the dark at this point, but maybe…

I’ll only be able to test this myself to check my missing dll error against yours once the new USB card arrives. IF Kinect decides to work with it :melting_face:

Got it working! Deleting the face example did the job.

So annoying that it’s so poorly supported now; the Kinect has a lot of interesting functionality for interactive art projects.