Neo Kinect - easy access to the Kinect v2 capabilities in your games

I can’t help with that one. It’s coming from a Blueprint called MHK: you’re trying to read an index (2, I think) from an array that doesn’t have 3 items (so index 2 doesn’t exist), and then you call Set Visibility on that nonexistent item.

Or the array has that many elements, but the item you’re trying to access is invalid (maybe an Actor that was destroyed, or one you added but never assigned a value to; the possibilities are endless).


Hello @RVillani As you said, replicating joint transforms from the AnimBP, and replicating the whole actor with all its variables, isn’t working for my replication. Should I multicast instead? Or could you please send some images or BP code for doing this? Please :pray:

I strongly suggest you take a course/tutorials on replication in Unreal.
I could send you screenshots and, even if you copied them perfectly, it could still not work depending on the rest of your project setup.

That said, I’ll list the most common issues that come to mind:

  • Ownership: you can only call a Server RPC from Actors owned by the local PlayerController. Check whether your RPCs fire as expected on the server and on clients using PrintString.
  • If using replicated lists, note that Array replicates, but Set and Map don’t.
  • To test whether your variables replicate as expected, make them RepNotify variables and PrintString from the notify function to see if it’s being called. It should be called on clients whenever the variable changes on the server.
  • Clients can’t replicate changes to other clients. All changes have to be sent via Server RPC to the server and then they replicate to clients from there.
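For reference, the server-authoritative pattern from the list above looks roughly like this in UE C++ (a declaration sketch only; the class and variable names are illustrative, not part of Neo Kinect, and the same setup exists in Blueprint as RepNotify variables and Server events):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"

UCLASS()
class AMyReplicatedActor : public AActor
{
    GENERATED_BODY()

public:
    // Replicated with a notify, so clients can verify the value arrived.
    UPROPERTY(ReplicatedUsing = OnRep_Score)
    int32 Score = 0;

    UFUNCTION()
    void OnRep_Score()
    {
        // Called on clients whenever Score changes on the server.
        UE_LOG(LogTemp, Log, TEXT("Score replicated: %d"), Score);
    }

    // Clients request changes through a Server RPC. This only works if the
    // calling client owns this actor (e.g. through its PlayerController).
    UFUNCTION(Server, Reliable)
    void ServerSetScore(int32 NewScore);

    virtual void GetLifetimeReplicatedProps(
        TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(AMyReplicatedActor, Score);
    }
};

void AMyReplicatedActor::ServerSetScore_Implementation(int32 NewScore)
{
    Score = NewScore; // changed on the server, then replicated to clients
}
```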

If those aren’t your issues, PrintString or add breakpoints in other places, checking whether things work as you assume they do, until you find where they don’t.

I would like to use Unreal Engine with two Kinect for Windows v2 sensors for motion capture. Have you thought of trying to get that to work? There is a program called Brekel that uses two sensors, found here: Brekel Kinect (free) » Brekel. I would use that, but I’d like to do the motion capture right in Unreal Engine.

I thought about it. But the plugin only talks to a single sensor and adding support for multiple would require quite the overhaul, unfortunately. That would risk great instability due to the amount of changes required.

Maybe a thought for a version 2. For now I’m going to just get Brekel, since it’s accurate from front and back and can use many sensors, not just Kinect. I’ve been thinking this through for a while. Thank you for answering my question; I’ll keep checking yours out for more feature developments.


How do I use gestures?

The Kinect v2 SDK doesn’t provide gesture functionality, so neither does my plugin. So, to answer your question: you need to create your own logic to recognize gestures, using the available joint data, like location and rotation.

Here’s a simple example in C# for recognizing a hand wave. Even if you don’t understand code, the text explaining the logic is pretty neat, and it would be simple to implement it in Blueprint with Neo Kinect.

Hello
Has anyone made a virtual dresser? I read in the Q&A that it was doable, but I was wondering if there have been any updates since ’18.
Best regards

I created the plugin for exactly that while I was a partner at a company. But due to the agreement I made when I left, I can sell the plugin, just not with a ready-made virtual dresser template.

However, in the Q&A section, the question “Any advice to create a virtual dresser?” has pretty much all the steps I used myself (more than once) to make one.

Cheers

thanks, I’ll try !


Hello, I want to use Neo Kinect in UE5 to drive simple model movement from near/far depth data, but I don’t know how to use it :sob:

Have you downloaded the demo project and the quick-start doc? Both are in the product description on the Marketplace and in the original post here.

The demo project has Blueprint examples of the main functionalities.

Hi,

I’ve been trying various methods to add sound effects to the character’s hand and leg movements in BP_AvateeringDemo, but I just can’t seem to figure it out.
Specifically, I’m aiming to have distinct sounds for SkeletalMesh0, or whenever the Kinect detects the skeleton’s movements.

If anyone has any experience with this and could share some examples or tips on how to make it work, I’d be incredibly grateful.

Hi @RVillani, congrats on the top contribution!
As @paranoio pointed out, in the UE5 example there seems to be a rotation error around the upper-arm joints.
A year ago, under UE4, I remember there was no such error with the default mesh.
Is this something addressable?
thanks

Hi @RVillani , firstly this is a great plugin.

I was just wondering whether it’s possible to capture coordinate information for other types of objects using this plugin (say, a ball or box in the physical environment), or whether that’s something you have considered developing?

I am currently looking at mixed reality applications and having physical objects also have virtual representations.

Thanks for your time

Hi, thanks for corroborating the report. Would you mind sending me a video pointing out the issue, with a description of what you’d expect it to look like? I’ll see if/how I can make it look better.

Cheers

No, unfortunately the Kinect only recognizes users as something you can get data from. For objects and everything else, all you can do is use the depth data, for things like masking 3D objects behind real ones, for instance.


Ah I see, thanks for getting back to me