
    Unreal Engine for Full Body Motion Capture: In Depth review of Perception Neuron and IKinema Orion

    Hi all,

    A couple of months ago I started writing a review of the motion capture systems I've used over the last couple of years. The plan was to record a 30-minute video review of both systems, but given the time required to record and edit the whole thing, I realized it wouldn't happen anytime soon, so I decided to adapt the reviews into a lengthy blog post. It describes how I've been using Unreal Engine as my main tool for Full Body Motion Capture and gives an in-depth review of both systems, with some bonus tips and overall pricing, so that studios and indie developers can get an idea of what they need to build an in-house motion capture studio, and how UE4 can help combine different mocap solutions into a single one.


    Link to Blogpost


    Criticism and comments are welcome!
    ENTER REALITY

    VR Solutions

    Contact us for more information

    #2
    Very helpful post, contains so much information.

    > Limited realtime character selection

    We ran into this problem too, and we managed to copy the rotation data from the IKinema male rig to our target rig in realtime, so we don't have to pay £200 for each of our characters.
    We are using Unity, but I think it's also doable in UE4.
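
    Something along these lines should work on the UE4 side (untested on our part, and purely a sketch: the component name and the BoneMap property are placeholders, and it assumes both skeletons share the same bone orientation conventions, otherwise you'd need to compensate for the different bind poses):

    // RotationRetargetComponent.h -- illustrative sketch only, names are placeholders
    #pragma once

    #include "CoreMinimal.h"
    #include "Components/ActorComponent.h"
    #include "Components/SkeletalMeshComponent.h"
    #include "Components/PoseableMeshComponent.h"
    #include "RotationRetargetComponent.generated.h"

    UCLASS(ClassGroup=(Mocap), meta=(BlueprintSpawnableComponent))
    class URotationRetargetComponent : public UActorComponent
    {
        GENERATED_BODY()

    public:
        URotationRetargetComponent() { PrimaryComponentTick.bCanEverTick = true; }

        // Mesh driven by the mocap solver (the IKinema/PN default character).
        UPROPERTY(EditAnywhere) USkeletalMeshComponent* SourceMesh = nullptr;

        // The character we actually want to drive.
        UPROPERTY(EditAnywhere) UPoseableMeshComponent* TargetMesh = nullptr;

        // Source bone name -> target bone name, filled in once per character.
        UPROPERTY(EditAnywhere) TMap<FName, FName> BoneMap;

        virtual void TickComponent(float DeltaTime, ELevelTick TickType,
                                   FActorComponentTickFunction* ThisTickFunction) override
        {
            Super::TickComponent(DeltaTime, TickType, ThisTickFunction);
            if (!SourceMesh || !TargetMesh) return;

            for (const TPair<FName, FName>& Pair : BoneMap)
            {
                // Read the solved rotation from the standard rig...
                const FQuat BoneRot = SourceMesh->GetBoneQuaternion(Pair.Key, EBoneSpaces::ComponentSpace);
                // ...and copy it straight onto the matching bone of our rig (pure FK, no IK).
                TargetMesh->SetBoneRotationByName(Pair.Value, BoneRot.Rotator(), EBoneSpaces::ComponentSpace);
            }
        }
    };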



      #3
      Originally posted by fengkan View Post
      Very helpful post, contains so much information.

      > Limited realtime character selection

      We ran into this problem too, and we managed to copy the rotation data from the IKinema male rig to our target rig in realtime, so we don't have to pay £200 for each of our characters.
      We are using Unity, but I think it's also doable in UE4.
      Yep, it's quite annoying, and as soon as I have a bit of time I was also thinking of developing something in UE4 to take the data from the standard character and retarget it in realtime onto any character.
      If you only used rotations I guess there's no IK, so basically it's all FK data, correct?
      I already used a similar setup for the Perception Neuron Vive integration (video here), but IK is kind of the point of doing the realtime retarget, because of the precision of optical tracking.
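
      Just to make the difference concrete: the FK copy only replays joint rotations, while an IK pass pins the end effectors to the optically tracked positions. The math of a basic two-bone solve is roughly the following (just my own sketch, not what Orion actually does; in UE4 you'd normally use the built-in Two Bone IK / FABRIK AnimGraph nodes, and the function and parameter names here are made up for illustration):

      // Two-bone IK sketch: given the shoulder position, bone lengths and a tracked
      // hand target, find elbow/wrist positions so the wrist lands on the target.
      #include "CoreMinimal.h"

      static void SolveTwoBoneIK(const FVector& Shoulder, const FVector& HandTarget,
                                 const FVector& PoleHint,   // roughly where the elbow should point
                                 float UpperLen, float LowerLen,
                                 FVector& OutElbow, FVector& OutWrist)
      {
          const float MaxReach = UpperLen + LowerLen;
          const FVector ToTarget = HandTarget - Shoulder;
          const float Dist = FMath::Clamp(ToTarget.Size(), KINDA_SMALL_NUMBER, MaxReach - KINDA_SMALL_NUMBER);
          const FVector Dir = ToTarget.GetSafeNormal();

          // Law of cosines: angle at the shoulder between the upper arm and the shoulder->target line.
          const float CosShoulder = (UpperLen * UpperLen + Dist * Dist - LowerLen * LowerLen) / (2.f * UpperLen * Dist);
          const float ShoulderAngleDeg = FMath::RadiansToDegrees(FMath::Acos(FMath::Clamp(CosShoulder, -1.f, 1.f)));

          // Bend plane defined by the pole hint, so the elbow bends in a sensible direction.
          const FVector BendAxis = FVector::CrossProduct(Dir, (PoleHint - Shoulder).GetSafeNormal()).GetSafeNormal();
          const FVector UpperDir = Dir.RotateAngleAxis(ShoulderAngleDeg, BendAxis);

          OutElbow = Shoulder + UpperDir * UpperLen;
          OutWrist = Shoulder + Dir * Dist; // clamped to reach, so the arm never over-extends
      }
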
      ENTER REALITY

      VR Solutions

      Contact us for more information



        #4
        Originally posted by Enter Reality View Post

        Yep, it's quite annoying, and as soon as I have a bit of time I was also thinking of developing something in UE4 to take the data from the standard character and retarget it in realtime onto any character.
        If you only used rotations I guess there's no IK, so basically it's all FK data, correct?
        I already used a similar setup for the Perception Neuron Vive integration (video here), but IK is kind of the point of doing the realtime retarget, because of the precision of optical tracking.
        Yes, we are using this on the virtual characters in VR, so FK data works fine for us because we can adjust our movements accordingly in VR. Otherwise, FK data may not be enough, just like you said.



          #5
          Originally posted by Enter Reality View Post

          Yep, it's quite annoying, and as soon as I have a bit of time I was also thinking of developing something in UE4 to take the data from the standard character and retarget it in realtime onto any character.
          If you only used rotations I guess there's no IK, so basically it's all FK data, correct?
          I already used a similar setup for the Perception Neuron Vive integration (video here), but IK is kind of the point of doing the realtime retarget, because of the precision of optical tracking.
          I saw your article, and unfortunately they're not planning to add any gloves.
          You wrote:
          "Unfortunately as of now, the Pro version comes without any fingers tracking, but they're planning to add those in short time, meanwhile you can add the fingers tracking by using the Noitom Hi5 VR Gloves and integrate them into the Mocap Setup."

          I contacted them and they said no, so glove support for the new Pro system is on hold with no timeframe for now, which means no gloves for the Noitom Pro.

          Is there any significant difference in mocap quality between PN 2.0 and Pro?
          I think mocap without gloves is 50% useless, because the hands are the main part of your expression aside from the face.



            #6
            Originally posted by WalterSulivan View Post

            I saw your article, and unfortunately they're not planning to add any gloves.
            You wrote:
            "Unfortunately as of now, the Pro version comes without any fingers tracking, but they're planning to add those in short time, meanwhile you can add the fingers tracking by using the Noitom Hi5 VR Gloves and integrate them into the Mocap Setup."

            I contacted them and they said no, so glove support for the new Pro system is on hold with no timeframe for now, which means no gloves for the Noitom Pro.

            Is there any significant difference in mocap quality between PN 2.0 and Pro?
            I think mocap without gloves is 50% useless, because the hands are the main part of your expression aside from the face.
            I didn't know they had put the gloves for the Pro version on hold, thanks for the info.

            Pro is way more stable than 2.0: due to how they designed the sockets that hold the IMU sensors, magnetic interference is less frequent, and since I got it I've never had any magnetic interference issues, even with my old habit of leaving the sensors on the suit instead of storing them in the box.
            The removal of the cables is a huge improvement. During some mocap sessions I had issues because I pulled a cable and an entire leg/arm would completely stop working, which is something that never happens with the wireless setup.
            On both I never experienced drifting, so I can't comment much on that.

            Regarding the fingers, consider that a lot of mocap solutions (IMU or optical based) don't include any kind of finger tracking, and as far as I know only recently have the "big guys" implemented finger tracking (custom or using available solutions), so that part of mocap was most of the time done by an animator by hand/poses rather than with mocap.
            Consider also that IKinema Orion does body tracking only, so the finger integration is something I developed because I'd had enough of animating fingers by hand.

            Considering that the pricing of the available mocap solutions is quite low, I have to say that if you want to get into this business you don't need OptiTrack or Vicon; you can get good enough results with the many alternatives that are around.
            ENTER REALITY

            VR Solutions

            Contact us for more information



              #7
              Originally posted by Enter Reality View Post

              I didn't know they had put the gloves for the Pro version on hold, thanks for the info.

              Pro is way more stable than 2.0: due to how they designed the sockets that hold the IMU sensors, magnetic interference is less frequent, and since I got it I've never had any magnetic interference issues, even with my old habit of leaving the sensors on the suit instead of storing them in the box.
              The removal of the cables is a huge improvement. During some mocap sessions I had issues because I pulled a cable and an entire leg/arm would completely stop working, which is something that never happens with the wireless setup.
              On both I never experienced drifting, so I can't comment much on that.

              Regarding the fingers, consider that a lot of mocap solutions (IMU or optical based) don't include any kind of finger tracking, and as far as I know only recently have the "big guys" implemented finger tracking (custom or using available solutions), so that part of mocap was most of the time done by an animator by hand/poses rather than with mocap.
              Consider also that IKinema Orion does body tracking only, so the finger integration is something I developed because I'd had enough of animating fingers by hand.

              Considering that the pricing of the available mocap solutions is quite low, I have to say that if you want to get into this business you don't need OptiTrack or Vicon; you can get good enough results with the many alternatives that are around.
              Thank you for the comprehensive answer!
              I'm actually looking for an all-in-one solution for fingers, face and full body.
              I want a system that will fully express the character when I perform.
              I'm experienced in Maya and Max but a beginner in UE.
              As I understand it, you have a solid background in mocap.
              I need some information; I don't even know where to start, there is so much information and so many solutions, but it looks like
              none of them (except the very expensive ones) can capture everything live at once.

              I have my eye on Faceware, which is very good quality, and it looks like Perception Neuron with the gloves covers only the body and fingers.
              I don't even know if I can have PN + Faceware and capture everything in UE.
              If I understand correctly, the only difference between PN 2.0 and Pro is the strap stability, which is what gives the quality; the sensors themselves are not better.
              What advice can you give? What system would you assemble if your goal was to capture everything?

              PS
              What I don't like in UE is the face quality; I need something more detailed than a game engine face.
              I want facial animation close to something like Snappers mocap.



                #8
                Originally posted by WalterSulivan View Post

                Thank you for the comprehensive answer!
                I'm actually looking for an all-in-one solution for fingers, face and full body.
                I want a system that will fully express the character when I perform.
                I'm experienced in Maya and Max but a beginner in UE.
                As I understand it, you have a solid background in mocap.
                I need some information; I don't even know where to start, there is so much information and so many solutions, but it looks like
                none of them (except the very expensive ones) can capture everything live at once.

                I have my eye on Faceware, which is very good quality, and it looks like Perception Neuron with the gloves covers only the body and fingers.
                I don't even know if I can have PN + Faceware and capture everything in UE.
                If I understand correctly, the only difference between PN 2.0 and Pro is the strap stability, which is what gives the quality; the sensors themselves are not better.
                What advice can you give? What system would you assemble if your goal was to capture everything?

                PS
                What I don't like in UE is the face quality; I need something more detailed than a game engine face.
                I want facial animation close to something like Snappers mocap.
                I've already developed (and sold) a full body solution to many developers, for Full Body Motion Capture as well as Virtual Production and YouTube comedy sketches, so I have quite a lot of experience with this.
                The solution I created uses either PN or IKinema Orion, with the Hi5 VR Gloves and the iPhone X, which does a good enough job of tracking your face, but not at the same quality as Faceware.

                If you're interested in this kind of solution, feel free to contact me.
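
                To give an idea of how simple the face part can be once the ARKit blendshape weights from the iPhone X are inside the engine (however you get them there, that part depends on your setup), each frame you just push the weights onto morph targets with matching names. This is only an illustrative sketch, not the actual code I ship, and it assumes your character's morph targets are named after the ARKit curves:

                // Sketch: apply incoming ARKit-style blendshape weights to a face mesh.
                #include "Components/SkeletalMeshComponent.h"

                void ApplyFaceWeights(USkeletalMeshComponent* FaceMesh, const TMap<FName, float>& BlendshapeWeights)
                {
                    if (!FaceMesh) return;

                    for (const TPair<FName, float>& Curve : BlendshapeWeights)
                    {
                        // e.g. "jawOpen", "eyeBlinkLeft" -> morph targets with the same names on the character.
                        FaceMesh->SetMorphTarget(Curve.Key, Curve.Value);
                    }
                }
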
                ENTER REALITY

                VR Solutions

                Contact us for more information

