Someone asked some questions on the Marketplace about how the Avateering demo Blueprints work, and I believe the answer might be useful to others, so I'll leave it here as well, where I can link to it and it's easier to find (I think).
How the Avateering demo Blueprints work
I'll call BP_AvateeringDemo and BP_AvateeringDemo_UE4 the Actor BPs, and AvateeringDemo_AnimBP and AvateeringDemoUE5_AnimBP the Anim BPs.
The Actor BPs communicate with the Anim BPs using functions from the BPI_AvateeringAnimBP Interface Blueprint. Both Anim BPs implement that interface to give meaning to its functions SetJointsTransforms and GetJointsTransforms. The Actor BPs get the AnimInstance from their SkeletalMeshComponents, which is the running Anim BP for a skeleton, and use those interface functions to send Transforms to it and also to get previously sent Transforms. By doing that, the Actor BPs don't need to know which Anim BP specifically they're talking to, meaning you can switch to other skeletons with different Anim BPs that implement that interface and the Actor BPs will still work with those skeletons, even if they're 6 different skeletons, as long as they use an Anim BP that implements BPI_AvateeringAnimBP.
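As a rough C++ analogy of that decoupling (all type and function names below are illustrative stand-ins, not the plugin's actual classes), the Actor side only ever sees the interface:

```cpp
#include <cassert>
#include <vector>

// Hypothetical stand-in for UE's transform type, just to illustrate the pattern.
struct Transform { float X = 0, Y = 0, Z = 0; };

// C++ analogy of the BPI_AvateeringAnimBP Blueprint Interface:
// the Actor only knows this interface, never the concrete Anim BP.
class IAvateeringAnimBP {
public:
    virtual ~IAvateeringAnimBP() = default;
    virtual void SetJointsTransforms(const std::vector<Transform>& Joints) = 0;
    virtual const std::vector<Transform>& GetJointsTransforms() const = 0;
};

// One concrete "Anim BP". Any other skeleton's Anim BP would work the
// same way, as long as it implements the interface.
class MannequinAnimInstance : public IAvateeringAnimBP {
    std::vector<Transform> Joints;
public:
    void SetJointsTransforms(const std::vector<Transform>& In) override { Joints = In; }
    const std::vector<Transform>& GetJointsTransforms() const override { return Joints; }
};

// The "Actor" side: it takes whatever AnimInstance the mesh is running,
// seen only through the interface, and pushes joint data through it.
void SendKinectJoints(IAvateeringAnimBP& AnimBP, const std::vector<Transform>& KinectJoints) {
    AnimBP.SetJointsTransforms(KinectJoints);
}
```

Swapping MannequinAnimInstance for any other implementation changes nothing on the Actor side, which is the point of the interface.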
The Actor BPs are the only ones that talk to the Kinect sensor, reading Joints Transforms data from it. Every frame, they fetch the previous frame's Transforms from the Anim BPs using GetJointsTransforms and interpolate those with the current frame data from Kinect. Then they send the interpolated Transforms back to the Anim BP using SetJointsTransforms. That way we can do noise filtering (smoothing) on Kinect's joints data without needing to store an extra Transforms array for each mesh just to read the previous frame data; we reuse the array already in each Anim BP.
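The per-frame smoothing loop described above can be sketched in plain C++ (the Vec3 type, function names and the Speed parameter are illustrative, not the plugin's actual API):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float X = 0, Y = 0, Z = 0; };

// Plain linear interpolation between two points.
static Vec3 Lerp(const Vec3& A, const Vec3& B, float Alpha) {
    return { A.X + (B.X - A.X) * Alpha,
             A.Y + (B.Y - A.Y) * Alpha,
             A.Z + (B.Z - A.Z) * Alpha };
}

// Per-frame smoothing as described above: read last frame's joints back
// from the Anim BP, blend them toward the raw Kinect data, and the result
// is what gets sent to SetJointsTransforms. No extra history array needed.
std::vector<Vec3> SmoothJoints(const std::vector<Vec3>& PreviousFromAnimBP,
                               const std::vector<Vec3>& CurrentFromKinect,
                               float DeltaTime, float Speed /* assumed tuning value */) {
    const float Alpha = std::min(DeltaTime * Speed, 1.0f);
    std::vector<Vec3> Out(CurrentFromKinect.size());
    for (size_t i = 0; i < Out.size(); ++i)
        Out[i] = Lerp(PreviousFromAnimBP[i], CurrentFromKinect[i], Alpha);
    return Out;
}
```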
The Anim BPs are the only ones that manipulate the Skeletal Mesh bones. I could've done it from the Actor BPs using nodes like SetSocketTransform on the SkeletalMeshComponents, but that's much worse performance-wise. Besides, using the Anim BPs to manipulate the Skeletal Mesh bones enables the usage of other Anim BP functionalities, like jiggle physics, post-process Anim BPs, Anim Layers and so on. But, to reiterate, the Anim BPs just use the transforms they're told to. The Actor BPs are the ones talking to Kinect, interpreting its Joints Transforms, then sending them to the Anim BPs.
Advice on creating virtual dressers.
Prepare yourself, because even with the features I added to the plugin just for that, it's complex.
It's very important that you use a skeleton with the same rig as the UE mannequin. Not just the same bone names, but also their orientations. Otherwise setup is harder (see the first questions in this Q&A section).
You'll need to activate color space joints transforms to have the joints' coordinates aligned with the color camera, right after you init the sensor on BeginPlay. That's done by calling the node Set Use Joints Color Space Transforms with Use checked. Once you've done that, you can use the Color Location and Color Orientation properties when you break a KinectJoint structure.
Scale the bones in the SkeletalMesh to the user's bones, as some people are taller or shorter than others. I did it by storing the original SkeletalMesh bones' lengths on BeginPlay, before any change was done to the mesh. To get a bone's length from the Skeletal Mesh, use Get Socket Location on the two joints that make up that bone (like upperarm_l and lowerarm_l for the left upper arm) and save the distance between them. I suggest saving those values in an array ordered the same as the Kinect joints. You can get an index from any Kinect joint value using Joint to Index, to keep it consistent when passing the values to the AnimBP.
Notice that for the array index I used the same joint used for the upperarm_l in the AnimBP.
Then scale the bones each frame by sending the correct scales to the SkeletalMesh AnimBP. The plugin gives you the user's bone lengths via Get Bone Length, which you can call on a tracked NeoKinectBody. What worked best for me was to scale only the axis along the bone (X for the Mannequin), so as not to change its thickness. Then it's almost unnoticeable that you're scaling it. The math is Scale = UserBoneLength / OriginalMeshBoneLength.
Scaling logic added to BP_AvateeringDemo, from the demo project.
I advise you to lerp the scale value over time (Lerp(PreviousFrameValue, CurrentFrameValue, DeltaTime * Speed)), because Kinect will jitter it constantly.
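Putting the two formulas above into plain C++ (function names are illustrative; in the demo this happens in Blueprints):

```cpp
#include <cassert>
#include <cmath>

// Scale = UserBoneLength / OriginalMeshBoneLength, applied only on the
// axis along the bone (X for the Mannequin) so thickness is unchanged.
float ComputeBoneScale(float UserBoneLength, float OriginalMeshBoneLength) {
    return UserBoneLength / OriginalMeshBoneLength;
}

// Kinect jitters the measured lengths, so blend the scale over time:
// Lerp(PreviousFrameValue, CurrentFrameValue, DeltaTime * Speed).
float SmoothScale(float PreviousFrameValue, float CurrentFrameValue,
                  float DeltaTime, float Speed) {
    float Alpha = DeltaTime * Speed;
    if (Alpha > 1.0f) Alpha = 1.0f; // clamp so fast frames can't overshoot
    return PreviousFrameValue + (CurrentFrameValue - PreviousFrameValue) * Alpha;
}
```

So a user whose upper arm measures 30 cm against a mesh bone of 25 cm would get a scale of 1.2 on that bone's X axis, eased in over a few frames.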
Finally, you'll need to mirror the scene horizontally. The easiest way is to scale the Actor to -1 on the Y axis, but back when I did that, it would break cloth simulation. So what I did was create a post process that inverts the U axis of the rendered scene by passing it through a OneMinus node in the material.
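For reference, the OneMinus mirror is just this remap of the horizontal texture coordinate (a sketch of the idea, not the actual material graph):

```cpp
#include <cassert>

// Horizontal mirror in a post process: sample the scene at 1 - U.
// This is what the OneMinus node does to the U texture coordinate.
float MirrorU(float U) {
    return 1.0f - U;
}
```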
Using pre-recorded data via Kinect Studio
While the plugin can be used with Kinect Studio, Kinect Studio itself can not send pre-recorded data to other Kinect apps without a sensor connected to the system.
- Make sure there are no other apps accessing Kinect together with Unreal (like Kinect demos).
- Even using Kinect Studio, it only works if the Kinect sensor is connected and working. It's annoying, but unfortunately that's how it works. This is not a plugin limitation, it's a Kinect Studio one. At least, mine doesn't work without the sensor connected to the system.
- Make sure Kinect Studio is connected to the Kinect service. When it is, it displays real-time data from the sensor when not playing a pre-recorded file.
- Open your project and play. You should see the data from Kinect Studio there. You can go back and forth between real-time data and pre-recorded data by playing and stopping the playback in Kinect Studio.
- I also noticed that, for real-time data, Kinect framerate is terrible if the Kinect Studio window is not focused.
Modifying the Anim BP to work with different skeletons
It's like manual retargeting
Retargeting won't work for Neo Kinect, because that's only for animation assets. Neo Kinect gets data from Kinect and converts it to Unreal Engine's coordinate space. Then, to make things easier for avateering, I compute the transforms that move the Mannequin skeleton correctly. For that to work, the bones' orientations are very important.
If your skeleton has the same bone names but the orientations are different, you'll have to customize the Avateering AnimBP for your bones' orientations. The Mannequin upperarm_r bone has X pointing opposite to the bone direction, Z pointing towards the skeleton's body and Y towards its back. To see those orientations, select the bone in the SK_Mannequin asset, press W to activate the translate tool and select the local transform mode in the top-right corner (first icon after the move/rotate/scale tools). If upperarm_r on your custom skeleton doesn't match that exactly, the orientation set in the Avateering AnimBP will look wrong.
But you can still compute correct orientations yourself.
Let's say your mesh's upperarm_r has X pointing forward, Z towards the bone direction and Y towards the character's ribs. This means that [X, Y, Z] on your mesh's upperarm_r map to [-Y, Z, -X] on the Mannequin. Therefore, to create the correct rotation, use the Make Rot from ZX node and pass it the correct axes from the Kinect rotation for that joint. There are nodes for all combinations of axes. I chose ZX here because Z is the axis along the bone; always pick that one first. The other axis doesn't really matter, as long as you pick the correct one from Kinect to convert. In this case, the Z axis on your mesh maps to -X on the Mannequin, so you need to get the forward vector from the Kinect Right Elbow joint rotation and multiply it by -1 to invert its direction. For X, which maps to -Y on the Mannequin, you get the rotation's right axis and multiply it by -1. That gives you the correct rotation for your mesh's upperarm_r coming out of the Make Rot from ZX node.
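A "Make Rot from ZX"-style construction can be sketched with plain vector math. This is an illustration of the idea only (the primary axis is kept exactly, the secondary axis is treated as a hint and re-orthogonalized); UE's actual node may differ in handedness and conventions:

```cpp
#include <cassert>
#include <cmath>

struct V3 { float X, Y, Z; };

static V3 Cross(const V3& a, const V3& b) {
    return { a.Y * b.Z - a.Z * b.Y, a.Z * b.X - a.X * b.Z, a.X * b.Y - a.Y * b.X };
}
static float Dot(const V3& a, const V3& b) { return a.X * b.X + a.Y * b.Y + a.Z * b.Z; }
static V3 Normalize(const V3& v) {
    float L = std::sqrt(Dot(v, v));
    return { v.X / L, v.Y / L, v.Z / L };
}

struct Basis { V3 X, Y, Z; }; // the three axes of the resulting rotation

// Sketch of what a Make Rot from ZX-style node has to do: Z is the
// primary axis (kept exactly), X is only a hint.
Basis MakeRotFromZX(const V3& ZIn, const V3& XHint) {
    Basis B;
    B.Z = Normalize(ZIn);
    // Remove the part of the hint that is parallel to Z, then normalize.
    float D = Dot(XHint, B.Z);
    B.X = Normalize({ XHint.X - D * B.Z.X, XHint.Y - D * B.Z.Y, XHint.Z - D * B.Z.Z });
    B.Y = Cross(B.Z, B.X); // completes the orthonormal basis
    return B;
}
```

For the example mapping above, the Z input would be the Kinect joint rotation's forward vector multiplied by -1, and the X input its right vector multiplied by -1.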
It will require manual work to get those correct for all bones, but I can't have the plugin support every possible bone orientation. So I followed the Mannequin as a standard.
Here's a tutorial I made about the Make Rot from [XYZ] nodes. I hope it helps.
Is there a way to change the frame dimensions coming from the Kinect? Currently it is 1920x1080, but I need it in portrait, 1080x1920. Is that doable?
Hey @RVillani
Is it possible to get data from multiple kinect devices using this plug-in?
Perhaps with some sort of modification?
I want to have 3 sensors placed in separate positions, track the closest human to each, and relay the info from all 3 to the same Unreal instance.
I am also wondering if Kinect can adjust tracking based on its orientation or whether it has to always be set in a certain orientation?
No. Kinect doesnât provide that option.
The ugly solution I used once was to display the Kinect camera filling the screen, with the sides cut off. It's very sad, because you have to blow up the 1080p video feed to 1920 high. But Kinect really doesn't have a vertical option. And if you turn the sensor, I don't remember why exactly, but it messes with something. You could try; maybe I'm wrong. But I think it has to do with the floor and angle detection, which I used to auto-align things in 3D.
Not with the plugin as it is. When I wrote it, I didn't realize that was a possibility, so everything works via a static singleton instance of the class that controls the sensor's functionality. That's even reflected in the BP functions that control it, which are all global, not requiring any specific sensor instance.
You may try to adapt it, but there will be two main issues:
- The Unreal part of the code, for which you get the source with the plugin, controls the sensor via a static singleton, as mentioned. That means you'd have to change a lot of code to make it work with multiple instances of that type.
- There's another part of the code that, for bureaucratic reasons, I had to keep in a dll, with no source provided. It's not Unreal code and it's pre-compiled, so you can package your project just fine, but if changes to that part are necessary, that'd be a stopper for the idea of multiple sensors. I'm sorry about that, but I had to do it like that because when I wrote the plugin I was a partner at a company. When I left, they allowed me to sell it as long as I hid that part of the code.
- Looking at it, I do see an issue. The function that initializes the sensor calls GetDefaultKinectSensor. So there's no way to tell it, from Unreal, which specific sensor to get. I'd have to change that, then change a bunch of the UE plugin code to not use global functions for everything. Or somehow make it optional to either use the default sensor for everything or have ways to control a specific instance the same way.
I have Windows 11, and the Kinect SDK installation keeps failing. The Windows forums solutions are worthless and not working.
How can I make it work?
Other than the registry fix described here, which worked for me, it could be an incompatibility with your USB chipset. I had to buy an additional USB card once because Kinect wouldn't work with the USB ports on my motherboard.
I don't know what else it could be. Check if you can get it working on other machines too, to eliminate the possibility of a defective sensor, I guess.
Villani, my issue is not the Kinect runtime. I have checked with the Kinect Configuration Verifier and everything passed. My issue is the SDK v2 installation. I need the DLLs from it to place in the project folder so it's detected and works in UE, according to your documentation. Sadly I don't have the "Lower Filters" entry mentioned in the blog.
Here is the snippet of the error I am getting when installing the SDK. OS: Win 11. Also tried with 10; the issue was the same.
[6764:2C44][2024-04-26T15:01:29]i399: Apply complete, result: 0x80004005, restart: None, ba requested restart: No
[6764:2C44][2024-04-26T15:01:33]i500: Shutting down, exit code: 0x0
[6764:2C44][2024-04-26T15:01:33]i410: Variable: VCRTx64Installed = 1
[6764:2C44][2024-04-26T15:01:33]i410: Variable: VCRTx86Installed = 1
[6764:2C44][2024-04-26T15:01:33]i410: Variable: VersionNT64 = 6.2.0.0
[6764:2C44][2024-04-26T15:01:33]i410: Variable: WixBundleAction = 4
Hey,
I'm getting a crash in 5.3.2
Happens when I open the NewKinectExamples project
I've added kinect.face.h, kinect.h and Kinect.INPC.H to the Unreal Engine install, and Kinect20.face.dll and NuiDatabase to the NewKinectExamples project.
Unhandled Exception: 0xc06d007e
KERNELBASE
UnrealEditor_NeoKinectUnreal!__delayLoadHelper2() [D:\a_work\1\s\src\vctools\delayimp\delayhlp.cpp:312]
UnrealEditor_NeoKinectUnreal!_tailMerge_kinect20_face_dll()
UnrealEditor_NeoKinectUnreal!NeoKinect::KinectSensor::SetUseFrame()
UnrealEditor_NeoKinectUnreal!UNeoKinectManager::GetFaces() [D:\build\U5M-Marketplace\Sync\LocalBuilds\PluginTemp\HostProject\Plugins\NeoKinectUnreal\Source\NeoKinectUnreal\Private\NeoKinectManager.cpp:932]
UnrealEditor_NeoKinectUnreal!UNeoKinectManager::execGetFaces() [d:\build\U5M-Marketplace\Sync\LocalBuilds\PluginTemp\HostProject\Plugins\NeoKinectUnreal\Intermediate\Build\Win64\UnrealEditor\Inc\NeoKinectUnreal\UHT\NeoKinectManager.gen.cpp:204]
UnrealEditor_CoreUObject
UnrealEditor_CoreUObject
UnrealEditor_CoreUObject
UnrealEditor_CoreUObject
UnrealEditor_CoreUObject
UnrealEditor_CoreUObject
UnrealEditor_CoreUObject
UnrealEditor_CoreUObject
UnrealEditor_CoreUObject
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_UnrealEd
UnrealEditor_UnrealEd
UnrealEditor_UnrealEd
UnrealEditor_UnrealEd
UnrealEditor_Engine
UnrealEditor_Engine
UnrealEditor_UnrealEd
UnrealEditor_UnrealEd
UnrealEditor
UnrealEditor
UnrealEditor
UnrealEditor
UnrealEditor
UnrealEditor
kernel32
ntdll
Did you get the Kinect SDK installation to succeed? I have been facing this issue for the last few days and don't know how to solve it. The Windows forums are no help.
Anyway, for your issue, here is some help from the forums:
https://answers.microsoft.com/en-us/windows/forum/all/the-exception-unknown-software-exception/96d61106-2cf2-450c-b61d-9dc2ce72f695
What's your Windows OS version and build number?
Yeah, I have the SDK installed. The Kinect works fine in Touch Designer. Win 11, v10.0.22631
What if you add the "Lower Filters" entry yourself?
Nick, it crashes when trying to activate the face features, indicating the dll is not where it's expected. Are you sure you copied the dll into the Binaries\Win64 folder inside your project folder?
Unfortunately, due to Kinect being very picky about USB controllers, I can't yet verify that the callstack looks like that when the dlls are missing. The USB card I used for Kinect stopped working, so I'm waiting for a new one from Amazon, hoping it works with Kinect v2. If I can't figure that out, I won't be able to keep the plugin up to date.
Hey,
Here's a screen grab of where the kinect20.Face.dll lives. Unreal made a 5.3 copy of the project when I opened it; could that cause any issues?
I just need body tracking for my project. Can I disable the face features if this can't be resolved?
I'm not entirely sure you can disable it, but you can try to simply not access the functionality.
I believe it's crashing when you try to PIE in the Examples level, right? If so, remove the BP_FaceTracking Actor from the level. That's the one using the face features.
At any rate, that looks like the correct file in the correct folder. Can you debug-run the project from a coding IDE to see if there's any info about the dll in the callstack? Maybe the address it's trying to load it from. Perhaps your instance of the project is not looking for it in Binaries\Win64 for some reason.
Do you use some shortcut to launch the project, or are you double-clicking the .uproject file? Shortcuts can change the startup folder of an application, which is where the application looks for dlls and other things automatically. I'm shooting in the dark at this point, but maybe…
I'll only be able to test this myself, to check my missing-dll error against yours, once the new USB card arrives. IF Kinect decides to work with it
Got it working! Deleting the face example did the job.
So annoying that it's so poorly supported now; the Kinect has a lot of interesting functionality for interactive art projects.