Neo Kinect - easy access to the Kinect v2 capabilities in your games


    #46
    RVillani, thank you for your response regarding the hip bones not being mapped correctly, and for the suggestion to use the default mannequin as the example. Your suggestions seem to have gotten us where we need to be. Thanks a lot!

    As an unrelated follow-up: do you have any guides/tutorials for extending the Color Mapping example (#2 in the NeoKinectDemos level) for use with a character model, sort of like Avateering in AR? It seems like all the pieces are there, but I am still a beginner with Unreal. Any suggestions would be appreciated.

    Thanks again for this great plugin!



      #47
      Originally posted by C9Uie49z04Km View Post
      do you have any guides/tutorials for extending the Color Mapping example (#2 in the NeoKinectDemos level) for use with a character model, sort of like Avateering in AR? It seems like all the pieces are there, but I am still a beginner with Unreal. Any suggestions would be appreciated.
      Some pieces are there, but there's still some stuff specific to this case that you'll need to do.

      Noise filtering for Kinect jitter: just lerp location and rotation values from one frame to the other. For the Lerp's alpha, multiply some value by delta seconds to keep the same filter strength across different frame rates.
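      As a minimal sketch of that filter in plain C++ (hypothetical names, not the plugin's API):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Jitter filter as described above: lerp from the previous filtered value
// toward the new Kinect sample. Scaling the alpha by delta seconds keeps the
// filter strength roughly the same across different frame rates.
// "Strength" is a made-up tuning constant: higher = snappier, lower = smoother.
float FilterJitter(float Previous, float Sample, float Strength, float DeltaSeconds)
{
    // Clamp so a long frame never overshoots past the sample.
    const float Alpha = std::min(Strength * DeltaSeconds, 1.0f);
    return Previous + (Sample - Previous) * Alpha;
}
```

      The same idea applies per component to locations; for rotations you'd lerp (or, better, spherically interpolate) the rotation values the same way.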

      You'll want to set the location of the pelvis only. The rest of the bones you should just scale and rotate. On BeginPlay, before transforming the skeleton in any way, store each of its bones' lengths. That way, you're able to scale them to fit the user's bone lengths.

      The KinectBody class has a function to get the user's bone lengths. The function returns the length from the joint you select to its parent joint. For instance, if you select Left Elbow, it'll be the length from that to the Left Shoulder joint.

      I advise you to get, on each frame, the target scales for the pelvis and spine mid bones and use their average as the YZ scale for the whole body. Then scale only each bone's X axis according to the user's proportions. That way you avoid the scaling creating a thin arm with a thick forearm and other weird stuff like that. Also, you'll want the joint locations in your skeleton to match as closely as possible where Kinect locates them on the user. For instance, the shoulder joint sits surprisingly far towards the middle of the body. It's weird, but the AR avateering will work much better and distort your skeleton less if you match the joints to where Kinect thinks they are.
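      A plain-C++ sketch of that scaling rule (hypothetical names and types, not the plugin's API): thickness (YZ) comes from the averaged pelvis/spine-mid target scales, while each bone's length (X) scale is its measured user length over the reference length stored on BeginPlay.

```cpp
#include <cassert>
#include <cmath>

struct FBoneScale { float X, Y, Z; };

// UserLength: the bone length reported for this user (KinectBody's function above).
// ReferenceLength: the same bone's length stored from the mesh on BeginPlay.
// PelvisScale / SpineMidScale: the target scales computed for those two bones.
FBoneScale ComputeBoneScale(float UserLength, float ReferenceLength,
                            float PelvisScale, float SpineMidScale)
{
    // One uniform thickness for the whole body avoids a thin arm meeting a
    // thick forearm and similar seams between adjacent bones.
    const float Thickness = 0.5f * (PelvisScale + SpineMidScale);
    // Only the bone's length axis follows the user's individual proportions.
    const float Length = UserLength / ReferenceLength;
    return { Length, Thickness, Thickness };
}
```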

      There might be some additional stuff you'll need to figure out, but these things I pointed out are the base of what you need. From there, I believe you can figure a lot out as you test.
      Last edited by RVillani; 04-25-2019, 05:40 AM. Reason: add missing lerp's alpha input explanation
      Freelancer Game Dev Generalist and Unreal Consultant | Portfolio
      Unreal products: Dynamic Picture Frames, Neo Kinect



        #48
        @RVillani, thanks for your comments.

        Noise filtering for Kinect jitter: just lerp location and rotation values from one frame to the other. Multiply some value by delta seconds in order to keep the same filter strength among different frame rates
        I've decided to table the Color Mapping for now and stick with straight Avateering. However, I am hoping to use the above concept to smooth the animation.

        Thanks again for your help!



          #49
          I recently bought the plugin, and I've been trying to access the highest depth value and turn it into a 3D position, but I just do not get the depth value T_T. I would appreciate any ideas you could give me.



            #50
            Hello, if I buy the plugin, does it come with source code? I'd like to basically use your framework as a base and add some custom code for some custom sensors.



              #51
              Hi, I just bought the plugin and it works great. The mo-cap tracking works smoothly with no drop in FPS even with 6 people simultaneously. I have a question: is there a way to flip/mirror the mo-cap data to make the animation look like a reflection of us? Right now, the tracking stays true to the data, i.e. if we move our right hand, the character moves its right hand, which is the opposite of how a mirror works. I am fairly new to Unreal Engine, so I'm not sure if this can be solved in the engine or in the plugin settings.



                #52
                Hi Rodrigo,

                The plugin is amazing. It works perfectly.
                However, I was wondering if it's possible to change the character in the example project provided? Creating a new project file with a blueprint from scratch seems very difficult. I have just started using Unreal Engine recently, so I have very limited knowledge of blueprints.

                I am planning to create a custom character with the same skeleton structure as the default mannequin so it should be easy to use with the plugin.

                Things I've tried
                - In the Avateering demo blueprint, I tried changing the skeletal mesh in the 'SkeletalMesh0-5' from SK_Mannequin to a custom mesh but it generates errors when I Play the level.
                - Tried attaching a custom skeletal mesh in the demo blueprint but it again generates errors

                It would be of great help for us beginners if you could provide a small basic tutorial explaining how to replace the characters in the demo file with our own characters, or how to set up a new blueprint for using the plugin with our own characters. If not, can you please point to any particular video I can refer to for setting up the nodes for this?



                  #53
                  I'm really sorry, everybody! I haven't got ANY notifications for this thread since C9Uie49z04Km's last comment!

                  Originally posted by xsdev View Post
                  I recently bought the plugin, and I've been trying to access the highest depth value and turn it into a 3D position but I just do not get the depth value T_T, I would appreciate any idea you could give me.
                  You need to activate the usage of one of the Depth frames with SetUseFrame first. Aside from that, how are you trying to access the values?


                  Originally posted by idacquis View Post
                  hello, if i buy the plugin, does it come with source code? I'd like to basically use your framework as base and add some custom code for some custom sensors
                  Yes and no. The source for the Unreal Engine part is all there, but it works on top of a pre-compiled lib, which is what talks directly to the Kinect. I can't release the source for that lib, as that was the agreement with my ex-associate that allows me to sell this plugin. I'm afraid the Unreal part won't help you much, as most of the heavy lifting of processing frames is done in the lib, unless you want to modify how the data is converted to textures and multithreaded in Unreal. That part is all in Unreal code.


                  Originally posted by PostOfficeStudio View Post
                  Hi, I just bought the plugin and it works great. The mo-cap tracking works smooth with no drop in the fps even with 6 people simultaneously. I have a doubt, is there a way to flip/mirror the mo-cap data to make the animation look like a reflection of us. Right now, the tracking stays true to the data i.e if we move our Right hand, the character moves its right hand which is opposite to how a mirror works. I am fairly new to Unreal Engine, so not sure if this can be solved in the unreal engine or the plugin settings.
                  I think I answered you in private, but for everybody's sake, here goes: the plugin won't flip the data, as that would greatly increase the math computations per detected user for everything besides tracking to work (like reprojections of locations from one frame type to another). What I do when I need mirror functionality is either to flip the Skeletal Mesh's Y scale to -1.0 or to use a post-process material that does One Minus on the U coordinate of the scene texture to mirror the whole scene. Both methods will require you to mirror the Skeletal Mesh's textures if they have logos or writing, and the scale-flip method might break clothing simulation. That's why I've mostly used the post-process method.
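                  Both mirror options boil down to simple flips. A plain-C++ sketch of the math (hypothetical names; in practice the first is a One Minus node in a post-process material and the second is the mesh's scale property):

```cpp
#include <cassert>

// Post-process method: sample the scene texture at (1 - U, V),
// which mirrors the whole rendered scene horizontally.
float MirrorU(float U) { return 1.0f - U; }

// Scale-flip method: negate the Skeletal Mesh's Y scale so the
// mesh itself is mirrored (may break clothing simulation, as noted).
struct FScale3D { float X, Y, Z; };
FScale3D FlipMirror(FScale3D Scale)
{
    Scale.Y = -Scale.Y;
    return Scale;
}
```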


                  Originally posted by Harsh Supari View Post
                  Hi Rodrigo,

                  The plugin is amazing. It works perfectly.
                  However, I was wondering if it's possible to change the character in the example project provided? Creating a new project file with a blueprint from scratch seems very difficult. I have just started using Unreal Engine recently, so I have very limited knowledge of blueprints.

                  I am planning to create a custom character with the same skeleton structure as the default mannequin so it should be easy to use with the plugin.

                  Things I've tried
                  - In the Avateering demo blueprint, I tried changing the skeletal mesh in the 'SkeletalMesh0-5' from SK_Mannequin to a custom mesh but it generates errors when I Play the level.
                  - Tried attaching a custom skeletal mesh in the demo blueprint but it again generates errors

                  It would be of great help for us beginners if you can provide a small basic tutorial which explains how to replace the characters in the demo file with our own characters or how to setup a new blueprint for using the plugin with our own characters. If not, can you please point out to any particular video which I can refer to for setting up the nodes for this.
                  You might be missing a setting when you import your character. When you import an FBX as a Skeletal Mesh, you have a Skeleton setting. Click it and search for the mannequin skeleton, to make your character share that same skeleton. If it's not on that list, click the Eye icon at the bottom and activate Show Engine Content. That's it. Now your skeletal mesh will just be a custom mesh for the same skeleton the blueprints are already using. Beware though: even if your skeleton's bone names and hierarchy are the same as the Mannequin's, if the orientations don't match, you're in for a lot of weird "bugs".
                  To check, open the Mannequin skeleton's Asset Editor (double-click it) and select each bone with the Move tool in Local Transform mode (the Globe/Cube icon next to the transform tools on the top right should be in Cube mode). That will show you the directions of the axes for each bone. Yours should match those exactly! To be on the safe side, I always start my skeletons for Kinect usage by exporting the Unreal Mannequin's skeleton and working on top of it. I've fixed other artists' rigs for that purpose, but it's no fun haha. Just keep in mind that in your DCC software one of the axes might be inverted because of Unreal's coordinate system. I'd import the Mannequin skeleton anyway, to check it side by side with the character's skeleton in the DCC software's own coordinate system.
                  As for your second tutorial idea (how to set up a blueprint from scratch), I made the blueprints to serve as a tutorial themselves. I understand some previous Unreal knowledge is required, but that's required to use a code plugin anyway, so I kinda expect it from the user.
                  And, with all this information, the only link I suppose I should point you to is the FBX Import Options Reference.


                  Again, I apologize to all of you for not answering sooner! I hope to have helped.
                  Last edited by RVillani; 09-17-2019, 01:44 PM. Reason: Fix grammar errors



                    #54
                    Originally posted by RVillani View Post
                    You might be missing a setting when you import your character. When you import a FBX as Skeletal Mesh you have a Skeleton setting. Click that and search for the mannequin skeleton to make your character share that same skeleton.
                    [...]
                    And, with all this information, the only link I suppose I should point you to is the FBX Import Options Reference.
                    This works like a charm. Thank you for the detailed explanation.



                      #55
                      Hello. I want to get a texture with only body pixels: Alpha = 1 where there's a body, 0 otherwise.



                        #56
                        Hi, I am working on a project and was thinking of buying your plugin. I was wondering if I can use it without an avatar? In my project, a few people at the same time should interact with objects made in Unreal, projected on the wall in a big room, but I don't need to see any character; basically just tracking movements without visualizing them in the projection. Would that be possible using your plugin? I am sorry if it's a silly question; it's just that I am quite new to Unreal and blueprints, so I discover as I go. But I saw that the only way to connect Unreal and Kinect is with your plugin. Any answer would be really appreciated. Thank you!



                          #57
                          Hey Rodrigo, I was wondering if there's a way to track only one person at a time. I am not sure if 'Get Nearest Kinect Body' can be used to achieve this. If not, is it possible to limit the far range of the Kinect so it ignores any bodies tracked beyond a particular distance?



                            #58
                            Okay, this time it took me a while to answer because the forums were down for maintenance. It wasn't me!! lol

                            Originally posted by XO73 View Post
                            Hello. I want to get a texture there is only body pixel, Alpha = 1 where there's a body, 0 otherwise.
                            I guess you'll be using that texture in a material, in which case only the material output matters. You can use the Body Index Frame in Color Space and the Color Frame textures in the material, then get the Red channel from the Body Index and pass it through a Ceil node. The result will be 1 if there's a body, or 0 if there's not. Then you can use the Color Frame texture as color/diffuse. Just be aware that the Body Index texture comes from a lower-resolution sensor than the color one, so the opacity will be pixelated if used full screen.
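                            That material graph reduces to one operation per mask pixel. A plain-C++ sketch of it (assuming, as described above, that the Body Index texture's Red channel reads 0 where there's no body and some small positive value where there is one):

```cpp
#include <cassert>
#include <cmath>

// Ceil turns "0 = no body, >0 = some body" into a clean 0/1 opacity mask,
// mirroring the Ceil node on the Body Index Red channel in the material.
float BodyMask(float BodyIndexRed)
{
    return std::ceil(BodyIndexRed);
}
```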


                            Originally posted by mashasha View Post
                            Hi, I am working on one project and was thinking to buy your plug in. I was wondering if I can use it without an avatar? In my project a few people at the same time should interact with the objects made in unreal, projected on the wall in a big room, but I don't need to see any character, basically just tracking movements without visualizing it in the projection. Would that be possible using your plugin? I am sorry it might be a silly question, its just I am quiet new to unreal and blueprints, so I discover as I go. But I saw the only way to connect unreal and Kinect is with your plug in. Any answer would be really appreciated. thank you
                            Hi, thanks for your interest in the plugin! Yes, you can use it without any visible avatar. The plugin itself has no avatar; it only outputs the locations and rotations of the detected users' skeletons (up to 6 users at once). The avateering blueprint in the demo project was created with the standard Unreal skeleton and is controlled using the plugin's nodes in blueprints. The plugin doesn't need a skeletal mesh to work.




                              #59
                              Originally posted by Harsh Supari View Post
                              Hey Rodrigo, I was wondering if there's a way to track only one person at a time. I am not sure if 'Get Nearest Kinect Body' can be used to achieve this. If not, is it possible to limit the Far-range of the kinect so it ignores any bodies being tracked at a particular distance.
                              That is possible in several ways, all of them controlled by you. Kinect will recognize people up to about 4.5m away from it; that's the sensor side of things. In your Blueprint, you can use 'Get Nearest Kinect Body' to always use the closest user's tracking data, but then your interaction would switch from user to user automatically.

                              If you want more control, you can use all users' tracking data (NeoKinectBody) at once and write your own logic for which KinectBody controls your interaction, depending on where they are, whether they're making a required pose, etc. In my projects it usually goes like the following.

                              Blueprint Setup
                              1. On BeginPlay, after the sensor is initialized, I use 'Get Kinect Bodies' and store the array in a variable using Promote to Variable (let's call the variable Bodies). It doesn't matter if users come and go; the NeoKinectBodies in that array will always be the same objects, so you can keep the array forever and keep reading their IsTracked property to know which are tracking a user at any time. Kinect will assign a user to a random index (could be 3, 0, 5, etc.), so don't expect the first NeoKinectBody in the array to always be the one tracking when a user is detected. It could be any index in the array. The only sure thing is that Kinect will keep the same index for a user until they go away.
                              2. I create a separate variable of type NeoKinectBody (let's call it ActiveBody) that will store my active user and will be null if nobody's active yet (either because no one has appeared, or because whoever was interacting went away or no longer fits the required conditions).
                              3. I have a function that validates ActiveBody for tracking. That function is called on Tick, and it's where I check whether the ActiveBody.IsTracked property is still true and whether it meets my custom conditions to remain the active user (like min and max distance, for instance). If any of the conditions aren't met, or if IsTracked is false, I set ActiveBody to null (just use the Set node and connect nothing to it). Let's call that function ValidateActiveBody.
                              4. Another function serves to activate a user when there's none. Let's call it ActivateUser. ActivateUser checks my custom conditions for the user to interact and is only called for NeoKinectBodies with IsTracked = true. In this function I could check if that body is in a specific interaction position, if it's making a required pose (by reading its joints' data), etc.
                              The logic for every frame (Event Tick)
                              1. Check if ActiveBody is valid. If it is, call ValidateActiveBody to make sure its user still meets the required conditions to interact with the game. This function, as described above, will keep ActiveBody valid if the related user meets the interaction conditions or will nullify ActiveBody otherwise.
                              2. If ActiveBody is not valid, do a For Each Loop With Break on the Bodies array. Then, for each body:
                                • Check if IsTracked is true. If it isn't, do nothing and skip to the next body.
                                • If IsTracked is true, call ActivateUser on it, which will validate if the tracked user meets the conditions to interact with the game.
                                • After calling ActivateUser, check if ActiveBody is valid. If it's not, just continue to check the next body. Otherwise, it means we have an active user and checking the remaining bodies isn't required, so we can just connect to the Break on the loop and that's it.
                              3. After all that logic and the loop, you'll have ActiveBody as either valid or invalid and, if valid, you can use it however you like.
                              Following these steps (if I haven't forgotten anything), you'll only have one user interacting with your game at a time, and that user won't change until they stop meeting your interaction conditions, even if Kinect is tracking other users and even if those users come closer to the sensor than the active one (unless your conditions say otherwise).
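                              The per-frame flow above can be sketched in plain C++ (hypothetical types standing in for the Blueprint pieces: KinectBody for NeoKinectBody, and MeetsConditions for the custom checks described in steps 3-4; the ActivateUser check is folded into the loop):

```cpp
#include <cassert>
#include <functional>
#include <vector>

// Stand-in for NeoKinectBody: only the fields this sketch needs.
struct KinectBody
{
    bool IsTracked = false;
    float Distance = 0.0f; // meters from the sensor, for an example condition
};

struct BodyPicker
{
    std::vector<KinectBody>* Bodies = nullptr;  // stored once on BeginPlay
    KinectBody* ActiveBody = nullptr;           // null when nobody is active
    std::function<bool(const KinectBody&)> MeetsConditions; // custom rules

    // Called every frame (Event Tick in the Blueprint version).
    void Tick()
    {
        // Step 1: re-validate the current active body (ValidateActiveBody).
        if (ActiveBody && (!ActiveBody->IsTracked || !MeetsConditions(*ActiveBody)))
            ActiveBody = nullptr;

        // Step 2: if nobody is active, scan the array for a tracked body that
        // meets the conditions; break as soon as one activates (ActivateUser).
        if (!ActiveBody)
        {
            for (KinectBody& Body : *Bodies)
            {
                if (Body.IsTracked && MeetsConditions(Body))
                {
                    ActiveBody = &Body;
                    break;
                }
            }
        }
    }
};
```

                              The array is scanned only while no one is active, so the active user keeps control even when other tracked users would also pass the conditions.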
                              Last edited by RVillani; 09-26-2019, 07:25 PM.



                                #60
                                Hey, I just bought your plugin. How could I achieve an accurate material using the Color and Depth masks? I really need it quickly, and I'm not having any success :/

