Perception Neuron/Salto3D data quality?

Hi there,

I’ve looked at several very convincing demonstrations of the Perception Neuron for producing mocap animation data. But before I go and place an order with them, I would like to hear if someone could post a few unmodified FBX files with some mocap data, so I can see what quality it produces in terms of maximum frames per second, precision, and jitter.

There’s also a Danish company, Rokoko, that is currently producing a mocap suit which I’m also considering. It would be nice to see a comparison of raw mocap data in order to evaluate how much post-processing each of these two suits would need. Since the Salto3D suit is not yet ready (estimated around September 2016), we can’t currently do a real comparison, but we could still look at what the PN can do.

So if someone who owns the PN reads this and would be so kind as to post a test FBX with some data, I would be very grateful.

Thanks in advance.

Here are some raw FBX files straight from Perception Neuron:

Keep in mind I’m still new to the system so this might not be the best representation of what the system can do.

I have the 32-sensor kit, but have been using it as though it were the 18-sensor version for convenience and reduced complexity. From what I can tell, the capture data from the fingers isn’t that useful, so I just work the fingers in post-production (though I haven’t really experimented all that much with the fingers).
If you do go with PN and money is tight, I would recommend just going with the 18-sensor version.

As far as FPS goes, PN captures at 120 fps when using 17 or fewer sensors, and at 60 fps when using 18 or more.
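That capture-rate rule is simple enough to encode in a couple of lines; here is a small sketch of it (the function name `pn_capture_fps` is just for illustration, not anything from the PN software):

```python
# The capture-rate rule described above, as stated in this thread:
# 17 or fewer sensors -> 120 fps; 18 or more -> 60 fps.

def pn_capture_fps(sensor_count):
    return 120 if sensor_count <= 17 else 60

print(pn_capture_fps(17))  # 120
print(pn_capture_fps(32))  # 60
```

So even the full 32-sensor kit drops to 60 fps once you go past 17 active sensors.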

Thanks, my man. Much appreciated. I’ll take a look at it very soon and post whatever I find in it. Also, great info about the fingers. Maybe someone else can pitch in with their experiences with the finger tracking.

I’ve looked at one of the animation files and the curves look very nice. There are some minor spikes here and there, but none of the weird twists I get when I track with iPi Soft Mocap Studio. So from my point of view, it’s very nice quality data they deliver. As with any mocap equipment, I guess there will have to be some foot-sliding fixes, but it still looks pretty straightforward to work with.
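If you want to quantify those minor spikes rather than eyeball the curves, one simple approach is to scan a rotation channel for large frame-to-frame jumps. A minimal sketch, assuming you’ve exported one channel of a joint’s Euler rotation from the FBX as a list of per-frame values (the curve and threshold below are made up for illustration):

```python
# Minimal spike detector for a single mocap rotation channel.
# `curve` is a list of per-frame angle values in degrees.

def find_spikes(curve, threshold_deg=15.0):
    """Return frame indices where the angle jumps more than
    threshold_deg from the previous frame."""
    spikes = []
    for frame in range(1, len(curve)):
        if abs(curve[frame] - curve[frame - 1]) > threshold_deg:
            spikes.append(frame)
    return spikes

# Example: a mostly smooth curve with a sudden pop at frame 3.
curve = [0.0, 1.0, 2.0, 42.0, 3.0, 4.0]
print(find_spikes(curve))  # -> [3, 4]: the jump in and back out both trip it
```

Counting spikes per joint across a take gives you a rough, comparable jitter score between suits.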

You can find some videos in my thread:

Perception Neuron can capture motion data in real time, and the accuracy is very high.
Foot sliding should not occur under normal circumstances; I don’t know how you came to that conclusion.

I’ll take a look at it, thanks. I looked at the animations, and if you look carefully, you can see the toe joint is moving a bit when it’s supposed to stay still. I bet it’s always going to be an issue with mocap to a greater or lesser extent, but as I said, it’s not that bad (by my standards, at least).
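That toe drift can be measured directly: during frames where the foot should be planted, track how far the toe joint’s world position moves on the ground plane. A rough sketch, assuming you sample the toe joint’s (x, z) positions per frame from your DCC (the positions and contact frames here are hypothetical):

```python
# Rough foot-slide check: max horizontal drift of the toe joint
# across frames where the foot is supposed to be planted.

def max_slide(toe_positions, contact_frames):
    """Largest (x, z) drift from the first contact frame, in scene units."""
    x0, z0 = toe_positions[contact_frames[0]]
    worst = 0.0
    for f in contact_frames[1:]:
        x, z = toe_positions[f]
        drift = ((x - x0) ** 2 + (z - z0) ** 2) ** 0.5
        worst = max(worst, drift)
    return worst

# Frames 0-2 are a plant; frame 3 is the foot lifting off again.
positions = [(0.0, 0.0), (0.01, 0.0), (0.03, 0.02), (0.5, 0.0)]
print(max_slide(positions, [0, 1, 2]))
```

If the worst drift during a plant stays below a few millimetres in scene units, the sliding is probably within what you’d clean up in post anyway.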

I have seen the foot-sliding problem you mention in other people’s plugin demo videos; that is not a correct implementation, so I hope it doesn’t mislead you. The foot-sliding problem happens when the limb lengths don’t completely match the motion capture data. There is a function in AXIS called “Displacement”; if the Unreal plugin doesn’t support Displacement, foot sliding will occur.
You can download AXIS to try it out.
Hope this helps.

I strongly suggest you play a bit with the joint stiffness values and the smoothing factor, mostly because you can ease out some of the unwanted extra joint movement.
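To see what a smoothing factor is doing to your curves, it helps to know the basic idea: each frame gets blended with the smoothed value of the previous one. This is a generic exponential-smoothing illustration, not the actual AXIS implementation:

```python
# Generic exponential smoothing of a jittery animation channel.
# factor in [0, 1): higher = smoother output, but more lag behind the motion.

def smooth(curve, factor=0.5):
    out = [curve[0]]
    for value in curve[1:]:
        out.append(factor * out[-1] + (1.0 - factor) * value)
    return out

noisy = [0.0, 10.0, 0.0, 10.0, 0.0]
print(smooth(noisy, 0.5))  # -> [0.0, 5.0, 2.5, 6.25, 3.125]
```

The trade-off is exactly what you feel in the sliders: crank it too high and the jitter goes away, but so does the snap of fast motions.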

In general, retargeting the mocap animation from Axis Player into Maya (or any other software) is never 100% perfect, so some degree of tweaking is needed.