Kinect 4 Windows v2.0 Plugin

[QUOTE=;187055]
I tried to get a blueprint running so my Kinect could move a character around a test room I made. However, I was not receiving any signal, and reading through here it looks like I was just missing some initial setup steps. Here is where I am at now: the problem is that the example Blueprints from RuBa's post above do not compile. Did I do something wrong here? It says my events are not compatible with the bindings I gave them for some reason; did you run into this issue too, RuBa? My blueprint here is basically a mashup of RuBa's blueprint and the OP's bone-rotation demo (which, by the way, I can't seem to get working either, so if someone has at least got that working it would be a great help).
[/QUOTE]

I did have some weirdness around creating the event bindings. I'm not sure why, but, as with most things, there is more than one way to do it. I noticed the problem when I made the event to bind to myself. I can't remember the exact details right now as I'm not in front of my dev box, but when you do the binding there are two options (I believe "Bind" and "Assign"). Try both; the one that creates the event binding for you (draws the little red line and makes a new event node) is the one you want to use. I have no idea why this happens, but it was yelling at me about a similar incompatibility.
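For what it's worth, the same rule shows up on the C++ side: the handler you bind to a dynamic delegate has to match the delegate's signature exactly, which is effectively what "Assign" does for you in BP by generating a matching event node. Here's a rough sketch of the pattern; the delegate and type names are made up for illustration, not the plugin's actual API:

[CODE]
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MyKinectActor.generated.h"

// Hypothetical "body event" notification, just to show the binding pattern.
DECLARE_DYNAMIC_MULTICAST_DELEGATE_OneParam(FOnBodyEvent, int32, BodyIndex);

UCLASS()
class AMyKinectActor : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(BlueprintAssignable)
    FOnBodyEvent OnBodyEvent;

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // AddDynamic only works if the handler's signature matches the
        // delegate's, the same compatibility check BP complains about.
        OnBodyEvent.AddDynamic(this, &AMyKinectActor::HandleBodyEvent);
    }

    // Must be a UFUNCTION with the exact parameter list of the delegate.
    UFUNCTION()
    void HandleBodyEvent(int32 BodyIndex) {}
};
[/CODE]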

Can you describe the issue you are having with the bone rotations?

Also, I wouldn't recommend using the transforms when you break the Kinect body. I know it says it's deprecated, but use the break that gives you the bones and work from the bone info. That seems to be the only reliable way I can get things to work. For some reason the transform data doesn't match the rest, which makes it very hard to work with. I'm not sure if the mappings are different or if something is off with the rotations, but it's just too difficult to deal with and I haven't had the time to look into it.

That being said, IMHO, the ideal way to do the animations here would be to set up an IK system rather than an FK system. Meaning the only bone in an arm that you would track is the hand (or wrist), and you retarget the elbow and shoulder based on that. It would feel similar to how A.R.T. works when using the hand IK control. You could also drive it from location rather than rotation, which I find easier to work with in general. This gives you a couple of big wins. First, you track fewer bones, which means less noise on the input. Second, it's less data to manage. That's not huge, but it's nice to handle less data from an API.
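If it helps to see why this is tractable, the core of a two-bone IK solve is just the law of cosines on the shoulder/elbow/hand triangle. UE4's anim graph also has a Two Bone IK node that handles this for you; the sketch below is plain C++ so the math is easy to follow, and the names and lengths are illustrative:

[CODE]
#include <algorithm>
#include <cmath>

struct TwoBoneIKAngles
{
    double ShoulderAngle; // between the upper arm and the shoulder->target line (radians)
    double ElbowAngle;    // interior angle at the elbow (radians)
};

// Given the two bone lengths and the distance from the shoulder to the
// tracked hand target, recover both joint angles from the triangle they form.
TwoBoneIKAngles SolveTwoBoneIK(double UpperArmLen, double ForearmLen, double TargetDist)
{
    // Clamp so the target is always reachable and acos stays in its domain.
    const double MinDist = std::abs(UpperArmLen - ForearmLen) + 1e-6;
    const double MaxDist = UpperArmLen + ForearmLen - 1e-6;
    const double d = std::max(MinDist, std::min(TargetDist, MaxDist));

    // Law of cosines, applied twice.
    const double CosElbow =
        (UpperArmLen * UpperArmLen + ForearmLen * ForearmLen - d * d) /
        (2.0 * UpperArmLen * ForearmLen);
    const double CosShoulder =
        (UpperArmLen * UpperArmLen + d * d - ForearmLen * ForearmLen) /
        (2.0 * UpperArmLen * d);

    return { std::acos(CosShoulder), std::acos(CosElbow) };
}

// A full solver also needs a pole vector to pick the plane the elbow bends
// in; that choice is what keeps the elbow from flipping around.
[/CODE]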

The big downside to using the Kinect bones over the transforms is that you will need to write your own filter. The one euro filter that's there only works on the transforms. This isn't all that bad, and it gives you a bit more control over the filtering. I redid the one euro in BP for the Kinect bones and I'm getting decent results. I did it a bit differently, though: I have one filter per parameter. Meaning, if I'm filtering the shoulder rotation, I have separate filters for yaw, pitch, and roll. This lets me filter each piece of data independently, since each has its own noise and therefore needs its own filter, and it lets me set the filter params per piece of data. It does make setup extremely tedious, but with some reasonable defaults and the option to override or ignore each filter, it's not that bad.
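In case anyone wants to port the same thing, here's the scalar one euro filter (Casiez et al.) sketched in C++, set up the way I describe above: one instance per channel (shoulder yaw, pitch, roll, etc.), each tuned on its own. The default parameter values are just illustrative starting points, not anything from the plugin:

[CODE]
#include <cmath>

class OneEuroFilter
{
public:
    OneEuroFilter(double MinCutoff = 1.0, double Beta = 0.007, double DCutoff = 1.0)
        : MinCutoff(MinCutoff), Beta(Beta), DCutoff(DCutoff) {}

    double Filter(double Value, double DeltaTime)
    {
        if (!bInitialized)
        {
            PrevValue = Value;
            PrevDeriv = 0.0;
            bInitialized = true;
            return Value;
        }

        // Smooth the derivative, then widen the cutoff as speed increases:
        // heavy smoothing when the joint is slow (kills jitter), light
        // smoothing when it is fast (kills lag).
        const double RawDeriv = (Value - PrevValue) / DeltaTime;
        PrevDeriv = Lerp(PrevDeriv, RawDeriv, Alpha(DCutoff, DeltaTime));
        const double Cutoff = MinCutoff + Beta * std::abs(PrevDeriv);
        PrevValue = Lerp(PrevValue, Value, Alpha(Cutoff, DeltaTime));
        return PrevValue;
    }

private:
    static double Lerp(double A, double B, double T) { return A + T * (B - A); }

    // Smoothing factor for an exponential low-pass with the given cutoff (Hz).
    static double Alpha(double Cutoff, double DeltaTime)
    {
        const double Pi = 3.14159265358979323846;
        const double Tau = 1.0 / (2.0 * Pi * Cutoff);
        return 1.0 / (1.0 + Tau / DeltaTime);
    }

    double MinCutoff, Beta, DCutoff;
    double PrevValue = 0.0, PrevDeriv = 0.0;
    bool bInitialized = false;
};

// Usage, one filter per parameter as described above:
//   OneEuroFilter ShoulderYaw, ShoulderPitch, ShoulderRoll;
//   double Smoothed = ShoulderYaw.Filter(RawYaw, DeltaSeconds);
// Note: naively filtering Euler angles can misbehave where the angle wraps
// (e.g. yaw jumping from 179 to -179), so watch out near the seams.
[/CODE]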

Just some food for thought.