Anyone used the new stereo HMC for MetaHuman facial animation?

Hey,
Just wondering if anyone has gone through the pipeline for the new facial animation auto-solving in MetaHuman Animator using a stereo HMC.
The guide was written for iPhone, and I fail on step one trying to use a stereo HMC.
Made a capture source. Set it to Stereo HMC Archive. Set the path to the HMC footage.
Opened the Capture Manager, selected the capture source I just made.
And.
Nothing. No footage to ingest.
I have footage in .mov and .mp4, but neither is found. I assume I am doing something dumb, but since this is a new feature there isn’t much in the way of guides…
thanks

Hi, I’m trying the same thing.

There was a tweet a while ago from the Unreal Engine account:

“Hi everyone! A lot of you asked why MetaHuman Animator only works on iPhones so far, it’s because it utilizes Live Link Face app, which leverages the iPhone’s TrueDepth front-facing camera, as well as its normal front-facing camera. We’d love to support more devices in the future as near-field depth sensors become more widely available.”

But there is an option for stereo footage, so I’m also wondering what make of HMC we’re seeing in the videos released by Aaron Sims and others.

Would be great to figure this out :smiley:

Yep, Bryan Steagall (Kidz Korner Studios) used footage from Dynamixyz.

Hey there! The documentation is still from the beta program, but there are some nuances to the workflow you can find in the article. Read everything in that article twice.
For example, the folders they reference are different in the beta than they are now: /Metahumantest/ is now /Metahuman/. Note that my reference to the calibration board below uses /Metahuman/.

Stereo Camera Calibration and Tools | Epic Developer Community (epicgames.com)

To condense it to the simplest terms:
You need a calibration board, which is located here:
C:\Program Files\Epic Games\UE_5.2\Engine\Plugins\Marketplace\MetaHuman\Content\StereoCaptureTools

Once you have the calibration video and files, you should be able to follow the article to break the videos down into an image-sequence folder with a JSON descriptor. From there you can bring it into the capture source, because the capture source detects that JSON file. Hope that helps.
GL
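In case it helps anyone stuck on the "break down the videos" step: this is a rough sketch (not Epic's official tooling) of splitting each camera's .mov/.mp4 into a numbered image sequence with ffmpeg. The exact frame naming, image format, and the JSON descriptor the capture source looks for are defined in Epic's article, so treat the paths and the `frame_%06d.png` pattern below as placeholder assumptions and follow the article for the real layout.

```python
# Hedged sketch: dump every frame of a stereo HMC take into an image
# sequence, one run per camera. Requires ffmpeg on your PATH.
# The output folder names and frame pattern are assumptions for
# illustration; check Epic's stereo capture article for the required layout.
import subprocess
from pathlib import Path

def build_ffmpeg_cmd(video: str, out_dir: str,
                     pattern: str = "frame_%06d.png") -> list[str]:
    """Build an ffmpeg command that writes `video` out as a
    numbered PNG sequence inside `out_dir` (created if missing)."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    return ["ffmpeg", "-i", video, str(Path(out_dir) / pattern)]

def extract_frames(video: str, out_dir: str) -> None:
    """Run the extraction, raising if ffmpeg fails."""
    subprocess.run(build_ffmpeg_cmd(video, out_dir), check=True)

# Example (hypothetical file names) -- run once per camera:
# extract_frames("take01_left.mov",  "Take01/ImageSequence_Left")
# extract_frames("take01_right.mov", "Take01/ImageSequence_Right")
```

After extracting both cameras, the JSON file described in the article is what ties the sequences together so the Stereo HMC Archive capture source can see the take.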