Vive Mocap Kit - Support

Link: Vive Mocap Kit in Code Plugins - UE Marketplace

For support, please write to me by e-mail.


To capture animation in UE4/UE5, download the main demo project (ViveMocap) from the Marketplace description page. A stripped-down example of a real-time VR avatar, without all the extra capture features, is called the “Simple Demo Project”.

As of UE5.2, the plugin still uses SteamVR (rather than OpenXR) for a number of reasons. If you own a copy of the plugin and need the OpenXR version, please write to me at the contact e-mail.

How to Start

Download and open the Vive Mocap project. It has two maps: MocapMap for capturing in a VR headset, and NoHMDMocapMap (which should be launched as PIE in a New Editor Window) for capturing without an HMD. Yes, it’s possible to use only Vive Trackers, without an HMD or controllers, but you need to modify SteamVR’s config files for such a setup. You can also use NoHMDMocapMap with a passive (unused) HMD.
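For the HMD-free setup, SteamVR normally refuses to start tracking without a headset. A commonly used workaround (this is a sketch of community-documented SteamVR settings, not official ViveMocapKit instructions — file locations and keys may differ between SteamVR versions) is to enable the null HMD driver and relax the HMD requirement in `steamvr.vrsettings`:

```json
{
  "steamvr": {
    "requireHmd": false,
    "forcedDriver": "null",
    "activateMultipleDrivers": true
  }
}
```

In addition, the null driver itself usually has to be enabled by setting `"enable": true` in its `default.vrsettings` inside the SteamVR install folder. Back up both files before editing.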

There are 3 presets on both maps: UE4 Mannequin, UE5 Manny, and MetaHuman. To select one of them, select the SkeletonPreset actor and, in its Details tab, click the “Set Default” button.

Launch the map (VR Preview or New Editor Window) and follow the calibration instructions. At least 4 points should be tracked: [1] head, [2–3] both hands, and [4] spine (either hips, which is recommended, or ribcage).
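The minimal tracking set above can be expressed as a simple validity check. This is an illustrative sketch, not plugin code — the role names are hypothetical placeholders, not ViveMocapKit identifiers:

```cpp
#include <set>
#include <string>

// Sketch: verify that the four required tracking points are present
// before calibration is allowed. Role names are illustrative only.
bool HasMinimalTrackingSet(const std::set<std::string>& TrackedRoles)
{
    const bool bHead  = TrackedRoles.count("head") > 0;
    const bool bHands = TrackedRoles.count("hand_l") > 0 &&
                        TrackedRoles.count("hand_r") > 0;
    // Spine can be tracked at either the hips (recommended) or the ribcage.
    const bool bSpine = TrackedRoles.count("hips") > 0 ||
                        TrackedRoles.count("ribcage") > 0;
    return bHead && bHands && bSpine;
}
```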


  • Select roles for all the Vive Trackers you use in the SteamVR dashboard (Settings → Controllers → Manage Vive Trackers). It doesn’t matter which role you assign, but a role must be set so the plugin can read the tracking status and filter out position errors.
  • You can attach up to 2 Vive Trackers to the same bone to improve tracking quality and stability. There are 2 bones for tracking the spine, hips and ribcage, so up to 4 Vive Trackers can be attached to the spine.
  • I recommend attaching the spine trackers at different angles: for example, the pelvis tracker shifted to the right and the ribcage tracker shifted to the left.
  • Keep tracking conditions in mind: for some animations, it makes sense to attach the hips or ribcage tracker at the back.
  • ViveMocapKit uses an IK model, so it’s enough to attach trackers to either calves or thighs to capture leg animation. Likewise, you can track either forearms or upper arms, but trackers attached to the upper arms are strongly preferable: they allow computing not just the elbow positions, but also the rotation of the clavicles.
  • If you have trackers on the upper arms, don’t skip the clavicle rotation offsets in the calibration menu; adjust them until you get decent clavicle animation.
  • It’s possible to exclude some Vive Trackers from body tracking. The setup isn’t obvious, so feel free to contact me.
  • It’s also possible to use any other tracking system (including optical) with tracked devices other than Vive Trackers.
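The idea of doubling trackers on one bone can be sketched as a blend that falls back to whichever tracker is still valid. This is an illustration of the concept under my own assumptions, not the plugin’s actual filtering code:

```cpp
#include <array>

// Sketch: two trackers attached to the same bone. A tracker that has
// lost tracking (status reported by SteamVR once a role is assigned)
// is simply excluded; otherwise the readings are averaged.
struct TrackerSample
{
    std::array<float, 3> Position;
    bool bIsTracking;
};

std::array<float, 3> BlendBonePosition(const TrackerSample& A,
                                       const TrackerSample& B)
{
    if (A.bIsTracking && B.bIsTracking)
    {
        return { (A.Position[0] + B.Position[0]) * 0.5f,
                 (A.Position[1] + B.Position[1]) * 0.5f,
                 (A.Position[2] + B.Position[2]) * 0.5f };
    }
    // Fall back to whichever tracker still reports valid tracking.
    return A.bIsTracking ? A.Position : B.Position;
}
```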


New assisted calibration mode on NoHMDMocapMap:

Outdated, but still useful: Vive Mocap Kit - Setup custom mesh to capture animation - YouTube

Setup in blank project (for VR real-time avatar): Vive Mocap: body tracking setup - YouTube

Capture without VR headset right in UE5 viewport even without starting Preview: ViveMocapKit Tutorial: Mocap in UE4 Viewport - YouTube

SteamVRTrackingLib Plugin

SteamVRTrackingLib is a helper plugin with additional functions used in the ViveMocap demo project. ViveMocapKit doesn’t need this plugin to work, but the demo project does. In particular, SteamVRTrackingLib serves to track SteamVR devices in the UE4 viewport and to bind Vive Trackers to static names. Read about SteamVRTrackingLib features here:…r-1507eb07eff3

If you own the ViveMocapKit plugin, just download the latest ViveMocap project from the Marketplace page. SteamVRTrackingLib is inside the ViveMocap/Plugins folder.


I’ll start. I see that many devs buy the plugin to create a player avatar in games, not for motion capture. First, a small note: don’t forget to uncheck the UseForMotionCapture checkbox (it’s enabled by default) in the component’s settings to keep the skeletal mesh scaled to the player’s height. The plugin was developed mostly for motion capture, so please write to me if you run into trouble or something doesn’t work exactly the way you want.

Secondly: if you use it for motion capture (as I do), would video tutorials about importing custom meshes, animation export, and processing be useful?


This looks great. I need to grab a Vive now and some trackers. Looking forward to this. Perhaps you could give us animator noobs a “high-level” overview of a usage scenario for creating an animation for an NPC. I subbed to your YouTube channel and saw the demo character animation, but how you set that up would be of interest.

I’m a noob in animation too, so it’s possible I do a lot of things wrong. I also know Maya is more suitable for animation, but I only know 3ds Max. I use the following workflow:

  1. First, I imported a version of my mesh skinned to a CAT skeleton into the VMK project. This isn’t strictly necessary: I could capture animation on the default UE4 Mannequin and then retarget it in MotionBuilder or Maya. But this step lets me avoid additional retargeting. Retargeting can be painful, especially if you use root motion, because Autodesk’s retarget tool doesn’t retarget the root bone.
  2. Second, I just capture the animation (as in the video above) and export it to FBX.
  3. Then I import the FBX into 3ds Max as a parallel skeleton and retarget it to my CAT rig (menu Animation → CAT → Capture Animation). I don’t retarget limb bones, only IK targets.

At this stage I have a normal captured animation, 100% ready for editing. So I can clean up frames I don’t like, remove noise from the feet targets, adjust hand positions, etc.
Then I collapse the layers with some step (the default 5-frame step works well for simple animations) to smooth the animation and re-export it back to UE4 via FBX.

This version is pending approval and isn’t available via the launcher yet (2017/11/6).

Version 1.0.3

Bug fix: root bone translation and rotation. You can enable/disable root motion in the Setup menu. If root motion is disabled, the root bone always stays at (0, 0, 0).
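The root-motion behaviour described above can be sketched as a simple resolver. The names are illustrative, not the plugin’s API: when root motion is disabled, the root bone is pinned to the origin instead of carrying the captured movement:

```cpp
#include <array>

// Sketch of the root-motion toggle described above (illustrative names).
struct RootTransform
{
    std::array<float, 3> Location;
    float YawDegrees;
};

RootTransform ResolveRootBone(const RootTransform& Captured,
                              bool bRootMotionEnabled)
{
    if (bRootMotionEnabled)
    {
        // Root bone keeps the captured translation and rotation.
        return Captured;
    }
    // Root motion disabled: root bone fixed at (0, 0, 0).
    return { {0.f, 0.f, 0.f}, 0.f };
}
```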

The update only affects the UE4.18 version. Please don’t forget to update the project files.

Is network replication on your schedule? If so, when will it be released? I’m really interested in it.

Unfortunately, I can’t give you any date. I’ll make it when I need it for my own work. Definitely not in December.

Well, thanks anyway. It’s a great plugin that will make developers’ lives easier :)

Update 1.0.4

Bug fix: root bone rotation (if root motion capture is enabled)
Bug fix: skeletal mesh scaling (when using for VR avatar, not for animation capture)
Bug fix: inaccurate hands positions when scaling is enabled
Bug fix: improved support for custom skeletons
Bug fix: crashes on start if bones map has invalid names

New feature: motion controllers can be used to track bones other than hands
New feature: logging

Don’t forget to update project files.

Thank you for the great plugin! I have been following the instructions in the PDF for adding a new skeletal mesh and ran into an issue where my elbow and knee joints bend the opposite way from how they normally would. Do you know of a way to fix this? I tried it with two different skeletal meshes.

Are the limbs twisted, or just bent the wrong way? If you send me your skeletal mesh (to my contact e-mail), it would make debugging easier.

The limbs are not twisted, they are just bending backward. I cannot share the skeletal mesh as it is a purchased asset.

1.0.5 Update info

New settings:
(bool) CaptureDevice::InvertElbows
(bool) CaptureDevice::InvertKnees

In some cases, elbows and/or knees could be bent backward. These settings allow you to fix this bug manually.
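Why a simple boolean can fix a backward-bending joint: in two-bone IK, the same end-effector target can be reached with the middle joint bent to either side, and a flag like InvertElbows/InvertKnees just flips that choice. Here is an illustrative planar sketch of the idea (my own math, not the plugin’s solver):

```cpp
#include <algorithm>
#include <array>
#include <cmath>

// Sketch: planar two-bone IK. The root is at the origin and the target
// lies on the X axis at distance TargetDist. The mid joint (elbow/knee)
// sits at (X, +H) or (X, -H); the invert flag mirrors the bend side.
std::array<float, 2> SolveMidJoint(float TargetDist, float UpperLen,
                                   float LowerLen, bool bInvert)
{
    // Foot of the perpendicular from the mid joint (law of cosines).
    const float X = (TargetDist * TargetDist + UpperLen * UpperLen
                     - LowerLen * LowerLen) / (2.f * TargetDist);
    // Height of the mid joint above (or below) the root-target line.
    const float H = std::sqrt(std::max(0.f, UpperLen * UpperLen - X * X));
    return { X, bInvert ? -H : H };
}
```

Both solutions place the hand or foot exactly on the target, which is why the wrong bend direction is only visible in the middle joint.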

The next update will include support for skeletons without a root bone (like Mixamo) and universal limb rotators (which make InvertElbows and InvertKnees unnecessary).

Is there any workaround for the 60 second limitation?

It’s possible to capture an animation asset using Sequencer instead of the console command, and I believe Sequencer doesn’t have this limit (though I’ve never tried it). However, it can’t be turned on and off at runtime.

For me it’s not a big issue. I usually just restart animation capture when it stops and end up with a set of 60-second animation sequences.
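The workaround above amounts to covering a long session with fixed-length takes. A small sketch of that bookkeeping (illustrative helper, not part of the plugin), with times in seconds:

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// Sketch: split a long capture session into takes of at most MaxTake
// seconds, as happens when capture is restarted each time it stops.
std::vector<std::pair<float, float>> SplitIntoTakes(float TotalSeconds,
                                                    float MaxTake = 60.f)
{
    std::vector<std::pair<float, float>> Takes;
    for (float Start = 0.f; Start < TotalSeconds; Start += MaxTake)
    {
        const float End = std::min(Start + MaxTake, TotalSeconds);
        Takes.push_back({Start, End}); // one captured animation sequence
    }
    return Takes;
}
```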

Update 5/4

  1. New scaling system: separate scaling factors for arms and height.
  2. Scaling flag moved to the animation blueprint (it can now be changed at runtime).
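The split scaling system above can be sketched as two independent ratios: one for overall height, one for the arms. The inputs and names here are my own illustration, not the plugin’s measurement code:

```cpp
// Sketch: separate scaling factors for height and arms. The height
// factor could come from HMD height vs. mesh height, and the arm
// factor from measured arm span vs. the mesh's arm span (assumption).
struct AvatarScale
{
    float Height;
    float Arms;
};

AvatarScale ComputeScale(float PlayerHeight, float MeshHeight,
                         float PlayerArmSpan, float MeshArmSpan)
{
    return { PlayerHeight / MeshHeight, PlayerArmSpan / MeshArmSpan };
}
```

Keeping the two factors separate means a player with long arms relative to their height gets accurate hand reach without stretching the whole mesh.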

  1. Network replication (not yet tested).
  2. Bug fix: shoulders animation

Update 5/16 (1.0.7)

  • knees orientation fix (when knees aren’t tracked)
  • improvements for use in games (calibration)

Basic Game Template:…l0QmPzR7IddLU1

Update 5/18 (1.0.8)

  • networking optimization
  • input from scene components


Do you have a C++ or clear blueprint project?

For future communication, could you write to me by e-mail? Unfortunately, the forum sometimes doesn’t send notifications about new messages in my topics.

Hi Yurinik,

I purchased your plugin, and am having issues using 4 lighthouses with my Vive Pro. It’s similar to the issue you resolved with the patch for more than 3 trackers, as the lighthouses just count as additional devices. The patch works for me in editor launches, but I’m not able to get it working in a packaged game: I get an “assertion failed: key is invalid” error.
Hoping you know what would be needed to get it working in a packaged game as well.

I posted the details in your patch forum page, but wanted to include a link here in case you check this one more often.

I also sent correspondence to those I know at Epic to see if the pull request can be expedited at all. Hope it helps.

Thanks for the patch btw!

Let’s continue here: