Neo Kinect - easy access to the Kinect v2 capabilities in your games

Tried manually adding too. No luck.

Nice! I mean, not really because something’s weird with the face features, but at least you can use everything else.

My new USB card worked (got one with the Renesas chipset). My callstack from not having the dll is similar to yours, but not quite the same.

While testing, I’ve made some improvements:

  • It won’t crash anymore when the dll is missing. It now detects whether the dll loaded successfully and logs an error message if it didn’t (see the sketch after this list).
  • If the dll and NuiDatabase folder are in the project’s Binaries/Win64 folder, they’ll get automatically copied to the packaged folder, no longer requiring the user to remember to do it manually.
  • Added tracing for Insights profiling on the methods most likely to cause hitches. I’ve not seen that happen, but it’s there if needed.
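For anyone curious how that looks on the C++ side, here’s a minimal sketch of the first and last ideas (dll detection and Insights tracing). This is not the plugin’s actual code; TryLoadKinectRuntime and AcquireLatestFrames are made-up names for illustration:

```cpp
#include "CoreMinimal.h"
#include "HAL/PlatformProcess.h"
#include "ProfilingDebugging/CpuProfilerTrace.h"

static void* GKinectDllHandle = nullptr;

// Load the Kinect runtime dll and log an error instead of crashing when it's missing.
static bool TryLoadKinectRuntime()
{
    GKinectDllHandle = FPlatformProcess::GetDllHandle(TEXT("Kinect20.dll"));
    if (!GKinectDllHandle)
    {
        UE_LOG(LogTemp, Error, TEXT("Kinect20.dll could not be loaded. Kinect features will be disabled."));
        return false;
    }
    return true;
}

// Scoped trace event so the method shows up in Unreal Insights if it ever hitches.
static void AcquireLatestFrames()
{
    TRACE_CPUPROFILER_EVENT_SCOPE(AcquireLatestFrames);
    // ... frame acquisition work ...
}
```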

I’m publishing those as an update to the existing 5.3 version and then I’ll test with 5.4 and publish that one too.

Did you manage to install the SDK? Does the Configuration Verifier display everything with a green checkmark?

I had to:

  • install the SDK;
  • restart Windows;
  • add the registry fix by creating an empty LowerFilters entry myself;
  • restart Windows again, and then it worked.

This is what my Configuration Verifier looks like:

Even though it doesn’t like my USB controllers, the Renesas card I bought worked. It has to be Renesas or Intel eXtensible Host Controller to work. This is the card I bought on Amazon.

And this is my Windows version:

Unfortunately Big NO.

First I need to get the Kinect SDK to install successfully before I can follow the next steps, like restarting and adding the registry keys. It’s not installing; Windows keeps throwing errors.

I have also checked with the Kinect Configuration Verifier. Everything is green (luckily I bought the official adapter from Microsoft back in the day, not a 3rd-party one).

The craziest part is that I’m on the same Windows build version.

Here is the error:

Somehow, after many PC restarts, the installation succeeded. :partying_face:

Thank you for your time @RVillani :fist_right: :fist_left:


However, the Kinect is stuck in a restart loop, even though I have added LowerFilters in the Registry Editor.

Googled it. None of the suggestions worked. Will update if I find any hope.

UPDATE: It’s working after a new Windows update, and NO MORE Kinect restart loops.

Greetings! I bought the plugin yesterday! Great one, btw. I’m trying to set up a green screen, but I cannot figure out how to merge the two Frame Types needed, as in your response: “Use the Body Index in Color Space (I’ll call it BICS) frame inside a material together with the Color frame and set to 0 opacity when BICS value is greater than 5.” I looked at the examples in the widgets/levels and Blueprints, but I couldn’t find exactly how to continue from Create Dynamic Material Instance for a dedicated BP_KinectFrame_GreenScreen… Do you have any other pointers, tutorials, or screenshots? Thanks for your attention!

Thank you for buying the plugin! I’m glad you like it :smiley:

That logic you quoted, you’ll set it up in your material. The one you pass the textures to after using Create Dynamic Material Instance.

In a material, create two texture parameters, one for the BICS texture and one for the Color.

From the BICS texture, get the R output and multiply it by 255, to convert Unreal’s 0–1 range back into the 0–255 values this texture actually stores. Then you can use If nodes in the material to set the Opacity output depending on the user index you’re reading from the texture. You could make it visible for any user by setting the output to 0 for values greater than 5 (the texture stores 0–5 for tracked users and 255 where there’s no user). Or you could add a Scalar Parameter to control which user you want visible via Blueprint.

Use the Color texture as the Emissive output in the material.

Then, in your Blueprint, apply that material to the plane that will represent your green screen and set the texture parameters after creating the dynamic material instance.
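If it helps to see the Blueprint part outside of nodes, here’s a rough C++ equivalent of those steps. It’s only a sketch: the SetupGreenScreen function is made up, and the parameter names “BICS” and “Color” must match whatever you named the parameters in your material (the material graph itself does the R × 255 and the greater-than-5 opacity test described above):

```cpp
#include "Materials/MaterialInstanceDynamic.h"
#include "Materials/MaterialInterface.h"
#include "Components/StaticMeshComponent.h"
#include "Engine/Texture.h"

void SetupGreenScreen(UStaticMeshComponent* Plane, UMaterialInterface* GreenScreenMaterial,
                      UTexture* BicsTexture, UTexture* ColorTexture)
{
    // Equivalent of the Create Dynamic Material Instance node.
    UMaterialInstanceDynamic* DynMat = UMaterialInstanceDynamic::Create(GreenScreenMaterial, Plane);

    // Equivalent of the Set Texture Parameter Value nodes; names must match the material's parameters.
    DynMat->SetTextureParameterValue(TEXT("BICS"), BicsTexture);
    DynMat->SetTextureParameterValue(TEXT("Color"), ColorTexture);

    // Apply the dynamic instance to the plane that represents the green screen.
    Plane->SetMaterial(0, DynMat);
}
```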

I made this tutorial a while ago, about controlling materials from Blueprints. My English was barely decent and I still had the terrible idea of having a background track, but I hope it can help.

Cheers


Thanks for the instructions! Do you have any other tips, like for adding a Niagara glow, or any other way to smooth the pixelated view a bit?


Hi, is it possible to use the plugin to drive the eyes of a MetaHuman, so the MetaHuman is always looking at you even as you move in the camera feed? Also, is it possible to connect two Kinects at the same time to get better motion capture results? Thanks.

For Niagara, move the system with a joint every frame. If you’re using skeletons, you can parent the system to a bone.
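As a rough sketch of the bone-parenting idea (assuming the Niagara module is enabled in your project; the function, asset, and socket names here are placeholders, not part of the plugin):

```cpp
#include "NiagaraFunctionLibrary.h"
#include "NiagaraComponent.h"
#include "Components/SkeletalMeshComponent.h"

// Spawn a Niagara system attached to a bone/socket so it follows that joint every frame.
UNiagaraComponent* AttachGlowToHand(USkeletalMeshComponent* Mesh, UNiagaraSystem* GlowSystem)
{
    return UNiagaraFunctionLibrary::SpawnSystemAttached(
        GlowSystem,
        Mesh,
        TEXT("hand_r"),                  // bone or socket name (placeholder)
        FVector::ZeroVector,             // relative location
        FRotator::ZeroRotator,           // relative rotation
        EAttachLocation::SnapToTarget,
        /*bAutoDestroy=*/ false);
}
```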

For the pixelated video… I don’t know. Maybe a smart blur filter (you’d have to do it manually in the material)? Or a sharpening filter (probably manually too; I don’t think UE has one of those ready).


Hi, the plugin works with a single sensor only.
I haven’t used MetaHumans, but if you can control their eyes from Blueprints (I can’t think of why not), that should be possible. However, for the user themselves, a 2D screen won’t add much to the illusion that the MetaHuman is looking at them; only people who are not the user might be tricked into thinking it’s looking at someone else. Think of a picture of a person looking at the camera: no matter where you view it from, it always looks like it’s looking at you, because the picture and the screen are 2D and lack depth.