Best home mocap solution?

Hello, I see a lot of hype about kinect 2, but is it worth the fame?

What can home mocap do, and what can't it do?

For example, I need a few eating and drinking animations, is it capable of capturing that?

  1. What's the best technique to capture a couple dancing? Or a couple eating? Is it possible to have 2 people in front of the Kinect simultaneously so it captures both? Or do I have to capture them one by one? But then their movements won't match up if it's a dance

  2. What is really the best home mocap solution, up to 300 USD?

Kinect is dead lol

What is there to lol about? Does it make you happy or what?

It's a usable technology that works regardless of whether it's still being manufactured

so buy a used one

Depends on the software you use for mocap. Some can only use 1 Kinect, some can use 2-3 Kinects.

When using Kinect v2 you can only use one Kinect v2 per PC, but with iPi Soft you can use 3 computers with 1 Kinect v2 each and capture motion from 3 different angles.

I've only tried using 1 Kinect (the old Kinect)… the capture was 'ok' for testing.

Also I don't think Kinects (for motion capture) can detect fingers :frowning: or head rotations… unless you use some accessories.

I have a Kinect 2 on my PC and tried all the software for motion capture… it doesn't work very well.

I tried one Kinect v2 with iPi Studio. Quality of animation: meh. Just bad. They say capture is much better with two Kinects (even v1), but I'm not sure if it's worth it.
Have you considered Perception Neuron?

Yes, the 3 Kinect v2s (using 3 computers) are networked together. I have never tried that setup, as I only have one Kinect v1.

You already have a Leap Motion? Do you not like it?

A solution that's cheaper, if you have a Vive, is IKinema's Orion. It's not perfect, as you do have to hand-key fine details like finger movement, and often head movement to an extent. But the results are usually very accurate, and very clean.

Note that if you don't have a Vive, it's not that much cheaper than a Neuron. But if you do, you only have the cost of the software and the three extra sensors.

The issue with the Neuron kits is that there aren't enough sensors, and you end up needing to buy more, which then becomes even more expensive.

As someone who has worked extensively with the Kinect & Kinect 2, there is only so much you can do with the motion data you capture from it. Firstly, the data is going to be very noisy, and secondly, you won't get the desired accuracy. I second Mithos56 on Orion, as it is a good solution.

That said, if you just need simple mocap animations for the actions you described, you might just want to utilize the mocap data from CMU (The Motionbuilder-friendly BVH Conversion Release of CMU's Motion Capture Database - cgspeed) and modify that if needed.
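If you go the CMU route, it helps to sanity-check a clip's length and frame rate before retargeting it. A minimal sketch in Python (the embedded sample is just a stand-in; point it at a real .bvh file from the CMU release instead):

```python
# Minimal BVH inspector: pulls the frame count and frame time out of the
# MOTION section so you can check a clip's length before retargeting.
def inspect_bvh(lines):
    frames = frame_time = None
    for line in lines:
        line = line.strip()
        if line.startswith("Frames:"):
            frames = int(line.split(":")[1])
        elif line.startswith("Frame Time:"):
            frame_time = float(line.split(":")[1])
            break  # per-frame channel data follows; nothing else we need
    return frames, frame_time, frames * frame_time

# Tiny stand-in for a real file; use open("some_clip.bvh") on CMU data.
sample = """MOTION
Frames: 300
Frame Time: 0.008333
0.0 0.0 0.0
""".splitlines()

print(inspect_bvh(sample))  # (frame count, frame time, length in seconds)
```

The CMU clips are long takes, so knowing the frame time (the conversions are typically 120 fps) tells you how much you'll need to trim and resample for game use.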

I feel mocap still has a high price floor, and unfortunately 300 USD may not get you the kind of results many might desire.

Yes, for proper mocap you need at least 30 sensors, otherwise it's useless

Orion adds $400 or so, so it's not much cheaper. But the Vive Trackers do give good quality.

I own both IKinema Orion and the Perception Neuron Mocap Suit.

If you don't have the Vive, I suggest you buy a Perception Neuron. It works pretty well, but it requires a lot of post-recording cleanup and tweaking, especially because the precision, due to the IMU sensors, is not the best.

Orion does its job very well (I'm using the 8-Tracker configuration). It also requires cleanup after the recording, but it is far more precise and very easy to use.

Interesting. I was going to buy a Perception Neuron because I thought it's better than the Vive.

It's very versatile, you can do a lot with it, and it allows for a very plug&play setup, especially if you want to stream the data directly into MotionBuilder or UE4. I used those streaming plugins A LOT for my projects, so you have everything you need.
They have now released a 2.0 version of the suit, which has stronger straps and an overall better design, but the core of the mocap itself is still the same.

Pros:

  • Versatility and a wider range of use (I used it in a 30m x 20m room without any issues)
  • No tracking cameras

Cons:

  • WiFi setup can become a nightmare; sometimes the hub won't connect and the setup takes 10 minutes, while sometimes it takes 5 seconds
  • Precision is lacking due to the IMU sensors

Overall, for that price, it's a great product :wink:

So you recommend Vive over other products? Can it capture some drinking animations, finger animation? Thanks, bud!

It depends what kind of animations you need and what degree of precision you need.

The Vive combined with IKinema Orion does its job really well, but as of now there are a couple of things which keep it from being the best possible solution.

In my case I created my own mocap solution in UE4 using Sequencer, and I also added a pair of Noitom Hi5 VR Gloves to the setup, so that I could record both full-body animations and finger animations as well. It's a very handy solution (pardon the pun) and lets you get a more accurate result, especially because 99% of the time you're going to animate fingers in post.

Pros:

  • Very accurate
  • Plug&play setup (2 minutes and you're good to go)
  • Realtime streaming (only to UE4 though, no Maya/MotionBuilder, which would be handy) and offline recording (FBX and BVH)

Cons:

  • No possibility of customization in UE4, so basically you can't tweak hand positions or anything else, and this is an issue with very precise movement (hand/finger animation)
  • You can only stream to the UE4 Mannequin or use the IKinema male/female characters, so if you have a custom character you need to pay £200 to IKinema to have the retargeting done and be able to stream the animation in realtime… which is something that really makes me question the usability of the entire system, since one of the first things you would do is have realtime animations applied onto your own character
  • Retargeting on the UE4 Mannequin is a bit off on the shoulders, but everything else is ok

If you own a Vive this is a good solution for home mocap, and now with the Vive 1 on sale, yep, it's still good!

Sample here:

With the Perception Neuron you need to take extra care with proper user calibration, and especially avoid causing the sensors to drift by using a mouse/keyboard while wearing the suit.

Pros:

  • No tracking camera limitations
  • Free plugins for lots of software (UE4, Unity, MotionBuilder, iClone, and so on)

Cons:

  • Not so accurate (because of the IMU sensors)
  • First-time usage can become a nightmare, especially if the WiFi doesn't work as expected
  • Forum support is very basic, but the Neuron support email does answer quite quickly
  • For good results you need to test different setups, so it might take time to get there

Sample video here:

Overall I think both of them are good solutions. Basically, for the same price you can now buy a Vive+Orion or a Perception Neuron, so it's really up to you which one to choose.

[USER=“2200”]Enter Reality[/USER]
Thanks, your explanation is very helpful. In the video it looks like IKinema's IK model isn't very good for mocap, because it drags the hip when the hands can't reach the trackers. IMO the hip position should have priority, to simplify post-processing (or at least that's so in my animation cases). Can you also explain how the body-to-trackers calibration works in Orion? Do you use a VR headset or just the trackers?

The video was done quite quickly; since then I have upgraded the straps (which were wobbling a bit at the time) and also made some minor adjustments to the JSON file which takes care of the calibration pose of the Mannequin, in order to tweak the position of some joints (the shoulders especially, and the head).
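For reference, that kind of JSON tweak is nothing fancy; here's a hedged sketch in Python (the file layout, the "joints"/"offset" keys, and the joint names are all made up for illustration — check the actual file Orion generates for the real structure):

```python
import json

# Sketch of nudging a joint offset in a calibration-pose JSON file.
# NOTE: the "joints"/"offset" keys and joint names are hypothetical;
# the real Orion calibration file will have its own layout.
def nudge_joint(data, joint_name, dx=0.0, dy=0.0, dz=0.0):
    j = data["joints"][joint_name]
    j["offset"][0] += dx
    j["offset"][1] += dy
    j["offset"][2] += dz
    return data

pose = {"joints": {"shoulder_l": {"offset": [0.0, 0.15, 0.0]}}}
nudge_joint(pose, "shoulder_l", dy=0.02)   # raise the left shoulder a bit
print(json.dumps(pose, indent=2))
```

Small nudges like this, re-tested against a recording each time, are usually enough to fix the shoulder/head alignment issues mentioned above.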

From reading the docs, the calibration is based on each tracker being in a predefined area when the calibration starts, meaning that the body is divided into different parts corresponding to each tracker location. From there I guess that they take the world position/rotation of each tracker, attach them to the IK on the entire body, and drive the animation like that, while also doing a bit of smoothing across frames, because I haven't noticed any twitching or weird things during recording.
In the video you can see that one of the shoulders pops into place, but that was due to one of the Lighthouses not being placed well.
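To make that guess concrete, the binning step might look something like this sketch (purely illustrative — the thresholds, segment names, and the idea of using height ratio plus lateral offset are my assumptions, not Orion's actual algorithm):

```python
# Sketch of the guessed calibration scheme: during the calibration pose,
# each tracker is binned into a body segment from its position.
# Thresholds and segment names are invented for illustration; left/right
# is picked from the lateral (x) offset relative to the headset.
def assign_segment(x, height, head_height):
    side = "left" if x < 0 else "right"
    ratio = height / head_height
    if ratio < 0.15:
        return f"{side}_foot"
    if ratio < 0.45:
        return f"{side}_knee"
    if ratio < 0.65:
        return "waist"
    return f"{side}_hand"  # a T-pose puts the hands up near head height

print(assign_segment(-0.1, 0.05, 1.75))  # -> left_foot
print(assign_segment(0.0, 1.0, 1.75))    # -> waist
```

Once each tracker has a segment label, feeding the per-frame world transforms into a full-body IK solver (plus some inter-frame smoothing) would give the behaviour described above.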

I'm using 8 Trackers in total, and I'm also developing a project where the user will wear 7 trackers plus the HMD; they also supply a setup where you can test the 7 Trackers + HMD configuration, and it's very fun to play with!
Overall I'm quite satisfied with the results, but the fact that you have to pay £200 to have a custom character you can drive with realtime animation in UE4 still infuriates me, because it should really be part of the deal; otherwise the streaming to UE4 is almost useless.