Vive Mocap Kit - Support

Nope: https://pp.userapi.com/c846420/v846420557/12c376/RYjZS-ErCU0.jpg

Yes. Sequence Recorder works the usual way. To capture the raw positions of the HMD and controllers (outside the bone hierarchy), you need 3 bones attached to the root bone that don't affect the mesh. Then attach them to mesh bones in ‘SocketsMap’: https://pp.userapi.com/c850216/v850216557/6f366/43-_476R-7g.jpg

Ohh ok, I see what you mean… so we just need to parent them under the root bone and add them to the SocketMap. But where do we find the HMD, Wands and Trackers? I might have an idea where the HMD and wands are located, but I have no idea where to find the Trackers.

I figured out how to do it inside of Unity. It was straightforward because all I had to do was add the tracker as a component to an empty group and use a 3D object as a representation for the tracker. But I want to use that method with VMK, and I don’t know if the concept is the same as in Unity.

Hm. Yes, you can’t capture raw controllers and the HMD - only hands and head with some static offset. But I can add it easily, no problem.

That would be great man! Thanks in advance

Hello, I just upgraded to 4.21 and now I am unable to use the plugin. I have 4 trackers with the headset and two controllers, and when I launch the project it no longer recognizes the wand controllers, so I am unable to do anything. What do I need to do to get the wands to work inside the project?

I’ve always wanted to mocap some fight scenes and this seems like a great solution, although of course this along with several trackers is a big investment.

I would like to get some opinions on how challenging it would be to animate two characters fighting each other. I assume this would mean animating one character, then playing that anim back while recording a new mocap that responds to it. Is this feasible? Also, has anyone implemented IK to the degree that live mocap could influence an existing anim?

Here’s an example of some VR fighting IK which looks pretty cool:

Do they work in other projects? You can start the VR Template and add a few motion controllers for the trackers to check.

This plugin only calculates the avatar body for a player and doesn’t include functionality for interacting with the environment. I.e., you would need to use tracing for punches (because a fist shouldn’t sink into the other character) and use physics for the reaction. That’s a lot of work.
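To illustrate the tracing-plus-physics idea, here is a minimal sketch in Unreal C++ of sweeping the fist and applying an impulse on hit. None of this is part of Vive Mocap Kit: the class, member names (FistRadius, PunchImpulse) and collision setup are hypothetical; only the engine calls themselves are real UE4 API.

```cpp
// Hypothetical punch check: sweep a small sphere from the fist's previous
// position to its current one, and push any simulating body that it hits.
void AFightCharacter::TracePunch(const FVector& PrevFistPos, const FVector& FistPos)
{
    FHitResult Hit;
    FCollisionQueryParams Params;
    Params.AddIgnoredActor(this); // don't hit our own capsule/mesh

    const bool bHit = GetWorld()->SweepSingleByChannel(
        Hit, PrevFistPos, FistPos, FQuat::Identity,
        ECC_Pawn, FCollisionShape::MakeSphere(FistRadius), Params);

    if (bHit && Hit.GetComponent() && Hit.GetComponent()->IsSimulatingPhysics())
    {
        // React with physics: impulse along the punch direction at the impact point.
        const FVector Dir = (FistPos - PrevFistPos).GetSafeNormal();
        Hit.GetComponent()->AddImpulseAtLocation(Dir * PunchImpulse, Hit.ImpactPoint);
    }
}
```

Stopping the fist from sinking into the other character (and blending the reaction back into the mocap anim) is the hard part the post is warning about.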

I am not really familiar with Unreal, so I can’t check that. When I open the 4.20 project, it works there, except the character falls to the ground once I stand in the T-pose and press the trigger button.

I’m having an issue where one of the avatar’s legs gets locked in a “karate” kick position. The leg points in the same direction no matter which way I’m facing. Any help would be appreciated.

I will be monitoring this post throughout the day let me know if more info/pictures are needed.

Thanks

Hi,

Just a quick question. Is this plugin ready for multiplayer? If so, does it have any tuts on how to correctly set it up for multiplayer?

Hi,

many many thanks for your fantastic plugin, which I purchased a week or so ago and have now finally got around to trying.

I have just a couple of questions. Firstly, when I run the level in Unreal, the level display on my PC screen turns blank. I understand that this may have been done purposefully to save processor/GPU time etc., but is there any way this could be toggled on or off via the keyboard or something? It is just that I would like to be able to sit at my computer and see what the motion capture actor / actual data being saved looks like in real time, so that I have a chance of re-directing the actor as we go along if there are any errors.

I have also had a few problems with the trackers/HMD seeming to lose sync, and the mannequin getting flattened/distorted for some reason from time to time. I am not too sure what might be causing this but will try to keep an eye out for any possible problems.

Also, would it be possible for us to stop/start/restart the recording via the keyboard too, please? Thank you :wink:

Yours sincerely,

Joby Wood

@jobywood
Hi,

It doesn’t save a lot of GPU time, actually. I did it once for my own reasons and forgot to disable it before uploading the project. Open VMK/Blueprints/BP_GameMode_HMD and change the spectator screen mode in the BeginPlay event from “Disabled” to “One eye cropped to fill”.
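For reference, the same blueprint change corresponds to a single engine call, so it could also be done from C++ (e.g. from a key-bound event, which would give the toggle asked about above). A minimal sketch, assuming a UE4 project with the HMD module available; the surrounding function is hypothetical, the library call is standard engine API:

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

// Sketch: re-enable the desktop mirror at runtime instead of editing
// BP_GameMode_HMD. Pass ESpectatorScreenMode::Disabled to blank it again,
// so binding this to a key gives a simple on/off toggle.
void EnableDesktopMirror()
{
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(
        ESpectatorScreenMode::SingleEyeCroppedToFill);
}
```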

It could be caused by occlusion from the base stations. To avoid that, I usually attach the pelvis tracker closer to the left or right side, not exactly in front. The pelvis tracker is the most important: if it loses tracking, the whole mesh gets distorted.

I’ll reupload the project later, but you can easily add it right now. Open VMK/Blueprints/BP_CapturePawn and add this:

That’s great!!

Many thanks for your help :slight_smile:

Is it possible to use the Vive trackers as hands and the Vive controllers as other body trackers? Is this already supported, or would it require modifying the blueprints? Also, how easy is it to feed real-time finger tracking data into the model, instead of the triggered preset hand poses?

There is a parameter in the properties of the CaptureDevice component, but it isn’t exposed to the VR menu. I.e., you would need to open the BP_CapturePawn blueprint, click the CaptureDevice component, and in the “Details” tab change “Motion Controllers Role” from “Hand only” to “Any role”.

It depends on the hardware (LeapMotion, VR gloves). I’m working on a universal animation node for this purpose, but it isn’t ready yet.

Hi again,

just out of interest, if it is possible to use the Sequence Recorder to record animations longer than 60 seconds, can you please tell me the name of the actor we will need to add?

Thank you.

BP_CapturePawn_C. Select it in the Sequence Recorder after launching VR Preview.

Thanks, I just preordered the VRFree gloves, for VR, and also hope to implement the finger tracking (via their Unreal SDK) into your plugin.

Hi, I’m testing the demo. I have enough space to use room scale, but during manual calibration the mannequin is inside the walls, so I can’t complete it. I rotated the orientation 90 degrees in room setup, but the application fits the main surface to the wider side, so I still can’t use manual calibration.

Is there a way to solve this problem ?