Kinect 4 Windows v2.0 Plugin

So excited for this. A simple Kinect tie-in with UE is something I've been wanting for a while. Unity makes it so easy.

Lion, your GitHub repo is 404. :frowning:

how bad is the response time?

I think the NDA prevents testers from discussing specifics of the kit.

Got my kit today, looking forward to doing some tests of my own :slight_smile:

Hello all!

Sorry I couldn't update this thread earlier, but things at my day job have been kinda crazy for the last few weeks.
(my day job -> The Mini Mobile Robotic Printer by ZUtA Labs Ltd. — Kickstarter)

I made a few advances, but they are really preliminary and need more work.

I hope to resume my work on the plugin ASAP

This is extremely cool!

But I am wondering if you have any idea of whether this will play nice with the positional tracking added to the Oculus Rift DK2? I’m concerned that the systems will confuse each other. I am really hoping to see a VR experience in which the hands can be used to interact with the game world and their positions reproduced accurately. From what I’ve read I’m not sure the Leap Motion is the right tool for the job, as I am more interested in the hand position over a large area than in the precise position of individual fingers.

This has huge potential - especially when used with Oculus Rift, if you can capture your own movements and realise them in a 3D world, and free yourself from using keyboard/mouse/joypad - that is true VR.

p.s. I don't think using a hand gesture like that would be very good for firing a weapon though - players will want speed and accuracy - and that's something only a trigger or button can provide. :confused:

That is the main reason I am doing it.

I have a game in mind :wink:

Leap doesn't have enough range for motion control. If you want to combine tracking there shouldn't be a problem; you just have to set up how your character skeleton interprets the information and applies it.

I'm setting up a GitHub repo; I'll share my code there.

Hi, thanks for sharing your code! Is the GitHub page still up? I tried to access it today and only found a 404 page.

It's not, mainly because I changed my approach to the problem; I'm actually already working on retargeting a mesh.
If someone can help me retarget the Kinect coordinate system to the Unreal coordinate system, it would speed things up a lot.
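For reference, the Kinect v2 SDK reports camera-space points in meters in a right-handed frame (+Y up, +Z out of the sensor), while UE4 uses centimeters with X forward, Y right, Z up. Here is a minimal sketch of one plausible axis remap, with plain structs standing in for `CameraSpacePoint` and `FVector`; the exact signs are an assumption and depend on whether you want the avatar mirrored:

```cpp
// Plain stand-ins for Kinect's CameraSpacePoint and Unreal's FVector,
// so the mapping can be shown outside the engine.
struct CameraSpacePoint { float X, Y, Z; };  // meters; right-handed, +Y up, +Z out of the sensor
struct Vec3             { float X, Y, Z; };  // centimeters; UE4: +X forward, +Y right, +Z up

// One plausible mapping (an illustration, not the plugin's actual code):
// Kinect Z (depth) -> UE X (forward), Kinect X -> -UE Y (sign flip to un-mirror),
// Kinect Y -> UE Z, and meters -> centimeters.
Vec3 KinectToUnreal(const CameraSpacePoint& P)
{
    return Vec3{ P.Z * 100.0f, -P.X * 100.0f, P.Y * 100.0f };
}
```

Flipping the sign on the Y term (or not) is what decides whether the on-screen skeleton moves like a mirror image of the player.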

We would be willing to help out, we have some experience doing this with the first version of the Kinect. Perhaps we could collaborate?
We’d also be interested to see the first approach, if you still have the code around, even if it isn’t very complete.

Can you share your approach? I'm essentially doing the same thing, but making sure to take an approach generic enough for other trackers. I've already got most of it working following essentially this approach. However, I was trying to keep things purely plugin-based (custom actors and a controller wrapped up in a plugin) and not change the main engine, but really the proper architectural fit is to use the standard input stack and modify the GenericApplicationMessageHandler class to include an OnPoseAction event with a TArray of FRotators as the parameter, or perhaps a named bone -> FRotator map.
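As an illustration of that named bone -> FRotator idea, here is a sketch using plain std:: types in place of FName/FRotator; `PoseMessageHandler` and `OnPoseAction` are hypothetical names, not actual engine API:

```cpp
#include <functional>
#include <map>
#include <string>

// Stand-in for FRotator (degrees), so the idea compiles outside the engine.
struct Rotator { float Pitch, Yaw, Roll; };

// Hypothetical pose-event signature: a named-bone -> rotation map, as suggested above.
using PoseMap     = std::map<std::string, Rotator>;
using PoseHandler = std::function<void(const PoseMap&)>;

struct PoseMessageHandler
{
    PoseHandler OnPoseAction;           // bound by whatever consumes the input

    void Dispatch(const PoseMap& Pose)  // called once per tracker frame
    {
        if (OnPoseAction) OnPoseAction(Pose);
    }
};
```

The map form has the nice property that different trackers can report different bone subsets without changing the event signature.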

I use a simple listener/dispatcher pattern.
The Kinect runs on its own thread and basically just dispatches processed Kinect data to listeners.
I was unsuccessful in adding it to the GenericApplicationMessageHandler without altering the engine itself.
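A minimal sketch of that listener/dispatcher shape, using plain std:: types instead of engine types; `BodyFrame` and `PollSensor` are placeholders here, not the plugin's actual code:

```cpp
#include <atomic>
#include <functional>
#include <mutex>
#include <thread>
#include <vector>

struct BodyFrame { int TrackedBodies = 0; };  // hypothetical stand-in for the processed Kinect data

class KinectDispatcher
{
public:
    using Listener = std::function<void(const BodyFrame&)>;

    void AddListener(Listener L)
    {
        std::lock_guard<std::mutex> Lock(Mutex);
        Listeners.push_back(std::move(L));
    }

    // Fan one processed frame out to every registered listener.
    void DispatchFrame(const BodyFrame& Frame)
    {
        std::lock_guard<std::mutex> Lock(Mutex);
        for (auto& L : Listeners) L(Frame);
    }

    // The Kinect runs on its own thread, as described above.
    void Start()
    {
        bRunning = true;
        Worker = std::thread([this] {
            while (bRunning) DispatchFrame(PollSensor());
        });
    }

    void Stop()
    {
        bRunning = false;
        if (Worker.joinable()) Worker.join();
    }

private:
    BodyFrame PollSensor() { return BodyFrame{1}; }  // placeholder for blocking on the Kinect SDK

    std::vector<Listener> Listeners;
    std::mutex Mutex;
    std::atomic<bool> bRunning{false};
    std::thread Worker;
};
```

One caveat with this shape in UE: listeners are invoked on the sensor thread, so anything touching game-thread-only state would still need to marshal the frame back over.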

Yea there are a lot of places it touches GenericMessageHandler, UPlayerInput, PlayerController, etc., especially for a module that is pretty Windows-specific at the moment. I don't want to go fussing around and diverge too far from the source until we get a more generic data path from Epic: something that can work for any skeletal-mesh-style input (Kinect 1/2, PrioVR, etc.). Once I get a couple of trackers working in a non-generic way I'll post my thoughts in the feature request forums.


So I really want to upload the code, but I'm having some trouble with GitHub :\

Also, after thinking it over, I decided to add an additional module to the engine itself to add support for custom hardware input.

So you should be able to get my code from my GitHub now.

Fork it, because I made changes to the engine itself.

I built your version of UE4 and have enabled the Kinectv2 plugin.
How do you set up your character?

Or if you have an example project with it running that would be great :slight_smile:

I haven’t had the chance to properly write an actor class yet.
But you can inherit a Blueprint from AKiniectListenerActor and you will have access to the OnBodyReciev event, though for now it is a bit useless because I haven't written the conversion from FBodyFrame to FTransform yet.

Sorry, I didn't read everything, I just watched the video :stuck_out_tongue: did you use code or Blueprints?

Hi Lion32,

I don't know how I've missed this thread, but I am quite impressed with your progress thus far! Are you planning on providing your plugin on GitHub or the Marketplace? I can't wait to see more of what you come up with!