Tried it, didn’t work… my BP setup is exactly like yours, as shown below.
“Keys” is currently connected to Tick; I also tried connecting it to Begin Play, and I tried adding a Print String on the True pin after the branch for Foot Right, but nothing shows up, so it’s not finding the Foot Right.
I have 5 trackers in total, each one sent to the boolean check to see if it’s close to the Right Foot device position, but so far nothing works, unfortunately.
You need to execute it once, after calibration. And in the ‘Branch’ nodes, attach the Cube to the corresponding motion controller component so that you don’t update positions on every tick.
I assume you have five motion controller components with MotionSource == Special_1 – Special_5? Do you use direct VR input or input from components? For input from components, it wouldn’t work. For direct input, the keys in the TrackersData map are the IDs of the SteamVR tracked devices. For input from components, the keys in TrackersData are the indexes in the array of components used to initialize input.
Screenshot seems OK. I suggest using the DrawDebugSphere and DrawDebugLine functions to visualize the positions of the SteamVR tracked devices and the motion controller components. The motion controllers probably have some offset because of the Low Latency Update, and you may simply need to increase the Error Tolerance in the Nearly Equal nodes to a few units.
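For reference, the comparison those Branch nodes are effectively doing can be sketched in plain standalone C++ (an approximation of the Nearly Equal (Vector) node, not the actual engine or plugin code; `Vec3` is a stand-in for FVector):

```cpp
#include <cmath>

// Minimal stand-in for UE4's FVector (illustration only; not engine code).
struct Vec3 { float X, Y, Z; };

// Rough equivalent of the "Nearly Equal (Vector)" comparison: true when
// every component differs by no more than ErrorTolerance (UE units, cm).
bool NearlyEqual(const Vec3& A, const Vec3& B, float ErrorTolerance)
{
    return std::fabs(A.X - B.X) <= ErrorTolerance
        && std::fabs(A.Y - B.Y) <= ErrorTolerance
        && std::fabs(A.Z - B.Z) <= ErrorTolerance;
}
```

For example, a tracker at (100, 50, 12) compared against a controller component offset to (101.5, 50.5, 13) by the Low Latency Update fails a 0.1-unit tolerance but passes at 5 units, which is why the branch may never fire with a near-zero tolerance.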
Wow, just tested it thoroughly for a couple of hours (with the elbows and knees); I can’t believe how accurate it is. Orion doesn’t give us the ability to track the knees, so if I were to get 2 more trackers, this system would be better and more accurate than Orion. Basically an FK rig. Will get this product ASAP; thank you @ for updating the executable demo. A couple of questions: why is it limited to 60 seconds? And can we use our trackers as props with the “Vive Mocap Kit”? The only method I can think of is using Sequence Recorder.
@GavinCrout1030
I use the console command ‘RecordAnimation’ to start and stop capture from within VR Preview. It’s limited to 60 seconds. If you set up Sequence Recorder for capture beforehand, you can of course save longer animations.
Hi,
Currently Keys is connected to Event Begin Play, so it’s executed just once.
I added 5 motion controller components in the BP CapturePawn, each one assigned Special_1, Special_2 and so on, and I’m using each one to compare the position of each tracker to the position of the “Right Foot SteamVR ID”.
I did as you suggested and used a DrawDebugSphere to see the position of the “Right Foot SteamVR ID”, and I noticed that it’s always getting the position of the HMD, not the right foot, so something must be wrong somewhere, since I can’t locate the right foot tracker properly.
Mmhhh… not sure if I understood correctly, but at Begin Play I’m able to see all the trackers I’m currently wearing, so my idea was to attach a mesh onto the feet trackers before the first part of the calibration even happens. Basically, as soon as the level starts I need to execute that setup, so that I can see the mesh attached to the feet.
If this is not possible, I was thinking of a possible workaround: checking if any trackers are below a certain height (e.g. 10 cm) and, if so, comparing the X values to check which one is on the left and which one is on the right… doable?
The CalibrateBody() function fills the TrackersData map, which is simply the binding between trackers and bones. Until this function is called, the plugin doesn’t know which trackers are for the feet. I have a function to predict the future calibration (PredictBonesCalibration, not yet available in the Marketplace version), but, just like the CalibrateBody() function, it requires a T-Pose. Although, since you only need the feet, it would possibly work for you. The PredictBonesCalibration function does exactly what you described to detect the feet: it looks for the lowest trackers to the right and left of the headset.
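The detection idea described above (and the workaround proposed earlier in the thread) can be sketched in plain standalone C++. This is my own approximation, not the plugin’s actual PredictBonesCalibration code; it assumes UE coordinates with the player facing +X, so the Y axis splits left (negative) from right (positive), whereas the real plugin presumably uses the HMD’s actual yaw rather than a fixed world axis:

```cpp
#include <vector>

// Plain stand-ins for engine types (illustration only; not plugin code).
struct Vec3 { float X, Y, Z; };
struct FeetGuess { int LeftIndex = -1; int RightIndex = -1; };

// Sketch of the feet-detection idea: among the tracked devices, take the
// lowest tracker (smallest Z) on each lateral side of the headset.
FeetGuess PredictFeet(const std::vector<Vec3>& Trackers, const Vec3& Headset)
{
    FeetGuess Result;
    float LowestLeft = Headset.Z;
    float LowestRight = Headset.Z;
    for (int i = 0; i < static_cast<int>(Trackers.size()); ++i)
    {
        const Vec3& T = Trackers[i];
        if (T.Y < Headset.Y && T.Z < LowestLeft)
        {
            LowestLeft = T.Z;
            Result.LeftIndex = i;
        }
        else if (T.Y >= Headset.Y && T.Z < LowestRight)
        {
            LowestRight = T.Z;
            Result.RightIndex = i;
        }
    }
    return Result;
}
```

With five trackers (two feet near the floor, two knees at mid height, one pelvis), the two indices returned are the ones with the lowest Z on each side of the headset.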
So I can submit an update for the plugin (it’ll take a few days) or give you a direct link to download the latest version with this function.
I just got the product. I have one more problem: after I calibrate, the arms can’t fully extend straight. When the arms are “straight”, they’re really bending at a 45-degree angle when they shouldn’t be bending at all. Is there any way to fix that?
Also, manually calibrating doesn’t help AT ALL. Whenever I move the controllers even slightly (while manually calibrating), it completely messes up the rig somehow.
Looking at the previous comments, [USER=“2200”]Enter Reality[/USER] had the same problem as me: the “arms not fully stretched” issue is what I’m having. So is the 4.19 demo more capable of fixing that problem?
You can tweak the arms calibration: go to Capture Device > Details tab > Setup > Arms (or something similar), and you will find a value of 1.
If you tweak that value (I set it to 0.85), you can offset the IK handle, allowing the arms to be fully stretched.
I guess this issue depends on a person’s arm length, so not everyone is seeing this behaviour, but please also be sure to have “Enable Scaling” enabled in the AnimBP, because that solves the issue without tweaking the “Arms” value.
That definitely helped the case a bit; my arms are finally stretched, THANKS. But now my actual hands are too far in front of the character’s hands. I’m trying to fix that problem by manually calibrating, but whenever I move any of the controllers even slightly, it ruins the rig completely.
I thought the hips would be fine without 2 trackers, but it seems like the flexible spine doesn’t work with just 1 tracker (I do have it on, btw). Maybe I should order another tracker, but I know it’s definitely possible to achieve with 1 tracker.
Can you show a screenshot or, better, a video? IMO it’s not about the arms scale; you have some problem with the calibration or setup. The flexible spine works with one tracker; it should track the pelvis (not the ribcage).
Ed1. Also, double-check the bone names in the preset actor and in the skeleton you use. If you retargeted the Mannequin’s animation blueprint for a custom mesh, remove the fingers animation (the Layered Blend per Bone nodes) in the anim graph. If your mesh has different bone names, it could break the hands, because the retarget manager doesn’t update the bone names inside Layered Blend per Bone nodes.
Ed2. And disable capture of root motion (in the menu, before calibration).
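The Ed1 bone-name check can be expressed as a small standalone sketch (my own hypothetical helper, not part of the plugin or the engine): list the bone names that the blend nodes reference but the target skeleton lacks, which are exactly the mismatches that silently break hand animation after retargeting.

```cpp
#include <set>
#include <string>
#include <vector>

// Hypothetical helper (not part of the plugin): report bone names that a
// Layered Blend per Bone node references but the target skeleton lacks.
std::vector<std::string> FindMissingBones(
    const std::set<std::string>& SkeletonBones,
    const std::vector<std::string>& BlendNodeBones)
{
    std::vector<std::string> Missing;
    for (const std::string& Name : BlendNodeBones)
        if (SkeletonBones.count(Name) == 0)
            Missing.push_back(Name);
    return Missing;
}
```

If this returns anything for your mesh, either rename the bones in the Layered Blend per Bone nodes or remove those nodes as suggested above.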
[USER=“2200”]Enter Reality[/USER]
I’ve submitted the update. The PredictBonesCalibration() function is what you need. Call it in BeginPlay and use the map it returns instead of TrackersData, as described on the previous page.
Also, I slightly changed the settings for non-default usage of the headset and controllers: https://pp.userapi.com/c850616/v850616698/3a29e/Lq-Smxi6zkg.jpg
If you were using Vive Trackers for the hands, change ‘Motion Controllers Tracking Role’ to ‘Don’t Use’ (ignore the right and left controllers) or ‘Any Role’ (use the right and left controllers to track any bone, not just the hands).
FYI, the spine IK setup you created a while ago works perfectly with just the pelvis tracker and the headset driving the spine itself, so the issue Gavin is describing is probably something else.
I’ll try to share a screen recording; I just need to figure out why the VR Preview is showing a black screen. Is it possible to screen-record the Vive’s display?