Neo Kinect - easy access to the Kinect v2 capabilities in your games


    #76
    I'm really struggling to get this to work with any custom characters.

    I'm trying to get it to work with any of the characters in this free pack - https://www.unrealengine.com/marketp...validated=true

    All of them appear to use the same base UE4 skeleton that your plugin uses.

    I've tried swapping out the mesh in blueprint to one of these, but it always fails when casting to the animation blueprint.

    I've tried updating the animation blueprint to use one of those models, but it seems impossible to change. I'm really confused about how to get this working with any model other than the base one. I seem to have the same problems as somebody else in this thread; I've gone over your explanations there multiple times, but I just can't seem to make any progress. Any guidance would be much appreciated.

    Thanks!



      #77
      Of course, as is usually the way, we solved the issue right after posting. It looks like those characters import with their own duplicate skeleton, which confuses things. When we manually re-import their FBX files and link them to the same skeleton the plugin uses, it all works OK.

      The one problem we still have is that the tracking seems quite glitchy at times (we are using a Kinect v2). It's the same with the base model provided. We aren't really sure why - we've tried a few different sensor positions, etc., but sometimes the arms just seem to flip around and it looks weird. Is there any way to smooth this out? There was mention of being able to access "tracking confidence", but we can't seem to locate it. It would be good to only update when the tracking is relatively confident, to eliminate these random glitches.



        #78
        Originally posted by UKdude View Post
        Of course, as is usually the way, after posting, we solve the issue. It looks like those characters import with their own duplicate skeleton, which confuses things. When we manually re-import the FBX for them and then link to same skeleton as the one in the plugin, it then all works OK.

        The one problem we have though is that the tracking seems quite glitchy at times (we are using a Kinect V2). It's the same with the base model provided. We aren't really sure why this is - we've tried a few different positions of the sensor, etc. but sometimes the arms just seem to flip around and it looks weird. Is there anyway to smooth this out? There was mention of being able to access "tracking confidence" but we can't seem to locate this... It would be good to be able to only update when the tracking is relatively confident to eliminate these random glitches.
        I forgot to come to the forums for a while. Sorry!
        I'm glad you figured out the skeleton import solution.

        That glitch is from the Kinect, unfortunately. Sometimes it thinks your thumb is on one side of the hand, and other times it just flips it, rotating the whole forearm. Confidence won't help much there. Smoothing will help a bit, but usually I just ignore the hands' rotation and lock the forearm's X rotation.

        For smoothing, on Tick you interpolate the new transforms from Kinect with the ones already registered in the AnimBP's transforms array, using Lerp. A is your old transform, B is the new one, and Alpha is how much you want to interpolate from A (0) to B (1). Smaller values make it really smooth, but it feels laggy. I always multiply WorldDeltaSeconds by a value (I start testing with 13) and use that as Alpha. That way the smoothing stays the same no matter how the frame rate changes. Just clamp WorldDeltaSeconds * value to 1, so that if DeltaSeconds is too high your final Alpha won't exceed 1.
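A minimal sketch of that smoothing in plain C++ rather than Blueprint (the function name is illustrative, and the joint transform is reduced to a single float; in the AnimBP you would Lerp each transform the same way, and 13.0 is just the suggested starting value, not a plugin constant):

```cpp
#include <algorithm>
#include <cmath>

// Frame-rate-independent smoothing: Alpha scales with the frame's
// DeltaSeconds, so the perceived smoothing speed stays constant.
float SmoothedValue(float Old, float New, float DeltaSeconds, float Speed = 13.0f)
{
    // Clamp so a long frame never overshoots past the target (Alpha > 1).
    const float Alpha = std::min(DeltaSeconds * Speed, 1.0f);
    // Standard Lerp: A + (B - A) * Alpha.
    return Old + (New - Old) * Alpha;
}
```

At 60 fps, DeltaSeconds * 13 is about 0.22, so each frame moves roughly a fifth of the remaining distance toward the new Kinect value; at lower frame rates the larger Alpha compensates, which is what makes the result frame-rate independent.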

        To get a joint's confidence, you call GetJointConfidence on a NeoKinectBody object. You get the body objects when you call something like GetNearestBody. There's also GetJointConfidenceAsExec, which works like a Switch node.
        Freelancer Game Dev Generalist and Unreal Consultant | Portfolio
        Unreal products: Dynamic Picture Frames, Neo Kinect



          #79
          Hello! I just bought your plugin and read the quick start guide, and I still have a few questions. First of all: there's no enum for a particular body number - like the one in "get joint location" where you can choose the joint - am I right? So the only way to keep a reference to a particular body is by addressing it by index in the "get kinect bodies" array and promoting it to a variable?
          The second question is about the second example in the Demo Room (Color Mapping), where you overlay 3D pivots on the image the Kinect shows. I found that if I step back about 3 meters from the Kinect, the pivots go behind the screen and my overlaid mesh disappears. Is it possible to push the screen back to increase the usable range between the screen and the player, so that the overlaid mesh won't go behind the screen and still properly overlays the camera image?
          The third one is about clamping Kinect's active range - how can I crop the distance Kinect should monitor?
          4. You wrote "What I do when I need mirror functionality is to either flip the Skeletal Mesh's Y scale to -1.0" - I've tried editing SK_Mannequin: I tried scaling the root to -1, and selecting all bones and scaling them to -1, but they won't flip in the "Avateering Demo". How did you achieve that?
          5. The "Compensate Kinect Pitch and height" function has a "Get Kinect Ground plane" node piped into a Select float node with the A value set to 140. Is that the value it defaults to - 140 cm from the floor - when the Kinect can't detect the ground plane? And hence, to be precise, I can manually input my sensor's real height from the floor into the "A" field of the Select float node - am I right?
          Last edited by psychedelicfugue; 03-12-2020, 09:34 AM.



            #80
            Originally posted by psychedelicfugue View Post
            Hello! I just bought your plugin, read quick start guide and still have a few questions First of all - there's no enum variable for particular body number - the one like "get joint location" where you can choose the joint - am I right? So the only way to have a reference to particular body number is by addressing body by index from the "get kinect bodies" array and promoting it to variable?
            The other question is about the second example in Demo Room (Color Mapping) where you overlay 3D pivots on the image that Kinect shows. I found that if I step back from Kinect for 3 meters or so - the pivots go behind the screen and my overlayed mesh disappears. Is it possible to push back the screen to increase the usable range between the screen and the player so that the overlayed mesh won't go behind the screen and still properly overlays camera image?
            The third one is about clamping Kinect's active range - how can I crop the distance Kinect should monitor?
            4. You wrote "What I do when I need mirror functionality is to either flip the Skeletal Mesh's Y scale to -1.0" - I've tried editing SK_Mannequin - tried to scale root to -1, tried to select all bones and scale them to -1 - but they won't flip in "Avateering Demo". How did you achieve that?
            5. The "Compensate Kinect Pitch and height" function has a "Get Kinect Ground plane" node piped into a Select float node with the A value set to 140. Is that the value it defaults to - 140 cm from the floor - when the Kinect can't detect the ground plane? And hence, to be precise, I can manually input my sensor's real height from the floor into the "A" field of the Select float node - am I right?
            Wow! Many questions. hahahha
            Let's go.
            First, thanks for buying the plugin! I hope you have a good time using it!

            1. About enums: no, there isn't one for the body indexes. There are 6 of them, fixed, and I use an int to do some trickery with the values on the index texture. For the body joints, yes, you need a body instance to get joint information.

            2. On the second example, the plane is adjusted dynamically with its distance from the camera. So it should still work just fine, filling the screen (from the correct POV) if you drag it a bit further from the camera in the blueprint and drag the blueprint back from the wall in the level.

            3. Cropping the interactable area is not a Kinect feature, so there's no ready-to-use function for it in the plugin. When I want to limit it, I keep a loop on Tick that iterates over all bodies. For the tracked ones, I check whether the pelvis joint is close to where I want it to be before giving the user any kind of feedback.
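That per-Tick check can be sketched like this in plain C++ (the struct and all names here are hypothetical stand-ins, not the plugin's API; positions are sensor-relative, in centimeters, like Kinect joint locations):

```cpp
#include <cmath>

// Stand-in for one of the 6 Kinect body slots.
struct BodyInfo
{
    bool bTracked;
    float PelvisX, PelvisY, PelvisZ; // pelvis joint, sensor-relative
};

// True if a tracked body's pelvis is within MaxDistance of the target
// point, i.e. inside the zone where the user should get feedback.
bool IsInsideInteractionZone(const BodyInfo& Body,
                             float TargetX, float TargetY, float TargetZ,
                             float MaxDistance)
{
    if (!Body.bTracked)
        return false; // untracked slots never trigger feedback
    const float DX = Body.PelvisX - TargetX;
    const float DY = Body.PelvisY - TargetY;
    const float DZ = Body.PelvisZ - TargetZ;
    return std::sqrt(DX * DX + DY * DY + DZ * DZ) <= MaxDistance;
}
```

On Tick you'd run this over all six body slots and only react to the ones that return true.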

            4. On BeginPlay of the AvateeringDemo I set the transform of all meshes to the default, with scale [1, 1, 1], so it will override any change you make in the mesh settings. But scaling the root or the Blueprint itself in the level should have worked. The joints are transformed in component space, which respects the parent Actor's transform.

            5. You are exactly right. That 140cm was my default in case it doesn't find a ground plane. If you know your sensor height exactly (like in a fixed installation for an event or something), I wouldn't even bother trying to get the detected height. I'd just go directly with the value I know is correct. Because even when it finds the floor, it might fluctuate a bit sometimes.
            Last edited by RVillani; 03-13-2020, 01:26 PM.
            Freelancer Game Dev Generalist and Unreal Consultant | Portfolio
            Unreal products: Dynamic Picture Frames, Neo Kinect



              #81
              Thank you for the good wishes and for your answers. I'll now continue experimenting, applying the knowledge you kindly shared!



                #82
                Hello, I've recently purchased your plugin and have a couple of questions. I have some experience with UE4, but not with Kinect integration, so I'd like to ask a few things:

                Is it possible to display the joint orientation value as text next to the joint? I tried on the default provided skeleton meshes, but it didn't work. For example, the hip and shoulder.

                Same goes for the joint angle: in Visual Studio I had to set the starting, mid and end joints and it would calculate the angle at the mid joint. How do I set up the blueprint to do it that way, or is there another way I haven't found?

                Is it possible to calculate the height of the user from the skeleton mesh (and display it)?

                Last but not least: is it possible to display the advanced features like the HighDetailFacePoints of the Face Tracking function? I mean not only eyes, mouth and nose, but these:
                HighDetailFacePoints_Leftcheekbone
                HighDetailFacePoints_Rightcheekbone
                HighDetailFacePoints_LowerjawLeftend
                HighDetailFacePoints_LowerjawRightend
                Thanks in advance



                  #83
                  Originally posted by Gen.PeanutButter View Post
                  Hello, I've recently purchased your plugin and have a couple of questions.
                  Thanks for purchasing the plugin!

                  Originally posted by Gen.PeanutButter View Post
                  Is it possible to display the joint orientation value as text next to the joint? I tried on the default provided skeleton meshes, but it didn't work. For example, the hip and shoulder.
                  How did you try it? It's certainly possible, since Unreal has a good UI system. The simplest way I can think of doing it is with Text Render components. Keep updating their location on Tick. To keep them facing the screen, you can use the FindLookAtRotation node to set their rotation looking at the camera.

                  Originally posted by Gen.PeanutButter View Post
                  Same goes for the joint angle: in Visual Studio I had to set the starting, mid and end joints and it would calculate the angle at the mid joint. How do I set up the blueprint to do it that way, or is there another way I haven't found?
                  If you're talking about the angle between two joints, the three-point method you mentioned is how I'd do it as well. If you want a method that uses only the two joints, you get their rotations and, from those, their forward vectors; then take their dot product and convert it to an angle with acos (degrees or radians).
                  [Image: NeoKinectRotation.jpg]
                  Just keep in mind that, for some joints, you need to invert the forward vector (multiply it by -1), as their X axis points in the opposite direction. Most of the right-side ones are like that.
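The two-joint method can be sketched in plain C++ (Vec3 and the function names are illustrative, not Unreal types; the joints' forward vectors, which in the engine come from their rotations, are passed in directly here):

```cpp
#include <cmath>

struct Vec3 { float X, Y, Z; };

float Dot(const Vec3& A, const Vec3& B)
{
    return A.X * B.X + A.Y * B.Y + A.Z * B.Z;
}

// Angle in degrees between two unit forward vectors. For right-side
// joints whose X axis points the opposite way, negate that vector first.
float AngleBetweenDegrees(const Vec3& A, const Vec3& B)
{
    // Clamp to acos' valid domain to avoid NaN from floating-point
    // drift when the vectors are nearly parallel.
    float D = Dot(A, B);
    if (D > 1.0f) D = 1.0f;
    if (D < -1.0f) D = -1.0f;
    return std::acos(D) * 180.0f / 3.14159265f;
}
```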

                  Originally posted by Gen.PeanutButter View Post
                  Is it possible to calculate the height of the user from the skeleton mesh (and display it)?
                  Displaying it works like the joint info above; Unreal gives you many ways to do it: UMG, Text Render, Print String etc. As for calculating it: Kinect doesn't give you the precise location of the top of the head, but it does give a point inside it. You can use that in combination with GetKinectGroundPlane. It returns a Height value that you can subtract from the Head joint's Z value to get its height from the floor. Check that Height is not zero first, because sometimes Kinect doesn't recognize where the ground is. After calculating Head Z - Floor Height, I'd multiply the result by some percentage to get closer to the actual height of the top of the head. Maybe something like 1.03 or so, to compensate for the head joint being in the middle of the head.

                  Originally posted by Gen.PeanutButter View Post
                  Last but not least: is it possible to display the advanced features like the HighDetailFacePoints of the Face Tracking function? I mean not only eyes, mouth and nose, but these:
                  • HighDetailFacePoints_Leftcheekbone
                  • HighDetailFacePoints_Rightcheekbone
                  • HighDetailFacePoints_LowerjawLeftend
                  • HighDetailFacePoints_LowerjawRightend
                  Thanks in advance
                  Sorry, but no. I can't remember right now why I didn't implement those, but I think there was some mesh dependency I had no idea how to bring into Unreal. Also, its lack of precision at whole-body distance was frustrating.
                  Last edited by RVillani; 04-27-2020, 07:12 PM.
                  Freelancer Game Dev Generalist and Unreal Consultant | Portfolio
                  Unreal products: Dynamic Picture Frames, Neo Kinect



                    #84
                    By the way, I've noticed something strange: I can't set a Cascade Particle System Beam's Start and End point locations if I spawn the beam and feed the Joint Location Color node in directly as the source. The beam's start and end points get set with a strange offset in that case. To work around it, I spawn some geometry, set its position using the Joint Location Color node, and then take the spawned geometry's world location to feed into the Set Beam Start Point and Set Beam End Point nodes.
                    Does this plugin have a way to show World Location Coordinates?
                    Last edited by psychedelicfugue; 05-03-2020, 08:23 AM.



                      #85
                      Originally posted by RVillani View Post
                      [...]
                      Thanks, I'll try to replicate and test some more.



                        #86
                        Originally posted by psychedelicfugue View Post
                        Does this plugin have a way to show World Location Coordinates?
                        The coordinates from Joints are relative to the Kinect sensor. So, if a joint is 200cm from it, it's gonna be 200 Unreal units. If you create a cube or something in the world to represent the sensor and make that the parent of the things you're positioning with joints' locations, you will see that relativity.
                        As for converting to World Space: take the world transform of the actor you want the joints to be relative to, call Transform Position on that transform, and pass the joint location to it.
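What Transform Position does can be sketched in plain C++, reduced to translation, a yaw rotation and uniform scale (Unreal's node applies a full FTransform; the type and function names here are illustrative):

```cpp
#include <cmath>

struct Vec3 { float X, Y, Z; };

// Simplified stand-in for an actor's world transform.
struct SimpleTransform
{
    Vec3 Translation;
    float YawRadians; // rotation around Z only, for brevity
    float Scale;      // uniform scale
};

// Scale, then rotate around Z, then translate: converts a local
// (sensor-relative) joint location into a world-space point.
Vec3 TransformPosition(const SimpleTransform& T, const Vec3& Local)
{
    const float C = std::cos(T.YawRadians);
    const float S = std::sin(T.YawRadians);
    const float SX = Local.X * T.Scale;
    const float SY = Local.Y * T.Scale;
    return {
        T.Translation.X + SX * C - SY * S,
        T.Translation.Y + SX * S + SY * C,
        T.Translation.Z + Local.Z * T.Scale
    };
}
```

With an identity transform the joint location comes back unchanged, which matches the "200 cm from the sensor is 200 Unreal units" relativity described above.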
                        Freelancer Game Dev Generalist and Unreal Consultant | Portfolio
                        Unreal products: Dynamic Picture Frames, Neo Kinect

