Ingesting iPhone depth footage problem

Hello, I am trying to ingest depth footage from an iPhone 12 using the Capture Manager in Live Link Hub. I'm on UE 5.6 (GitHub version). Everything connects; I queue the videos and process them. When I go back to the editor, it reports that there are new files that need to be imported, which I do. It eventually gets to the calibration.json file and doesn't know what data asset type it is. I searched through the available types and could not find anything to do with calibration, depth, lenses, or the many other types I would expect to be associated with a MetaHuman animation calibration. I attached a screenshot of the dialog that pops up.

Does anyone know what I am supposed to do?

Hi John,

My first thought is that some of the MetaHuman Animator plugins are not enabled in the Unreal Engine project.

Please check you have the following enabled:

- MetaHuman Animator Calibration Processing

- MetaHuman Animator Depth Processing (available from Fab)

- MetaHuman Animator

- MetaHuman Live Link (only required for real-time animation)

Thanks,

Mark.

Thank you. And yes, MetaHuman Runtime should not be required, as it has been superseded by the MetaHuman SDK plugin.

Has this worked previously for you (and so it is a recent/new issue) or are you trying to ingest iPhone depth data for the first time?

Please can you share the output logs from both Live Link Hub and Unreal Engine (you should be able to attach them to this thread)? Can you also confirm that you are using Windows (MetaHuman Animator is not yet supported on macOS or Linux)?

Thanks,

Mark.

Hi John,

Unfortunately, the full log shows as not being available, so I am unable to download it.

However, based on the error message, I think we need to dig a little deeper into your project setup.

When ingesting iPhone footage using Capture Manager in Live Link Hub, there is a choice between the Live Link Face Device and the Stereo Video Device. The Live Link Face Device allows you to connect to the iOS device over the network, whereas the Stereo Video Device assumes the data has already been downloaded. I assume you are using the latter.

The data captured on the iPhone should include a depth_metadata.mhaical file which is the calibration from the device itself. During ingest using Capture Manager, this is temporarily converted into a calibration.json file, before appearing in Unreal Engine as a combination of Camera Calibration and Lens File assets. The calibration.json exists only temporarily as a working file and should not appear in Unreal Engine itself.

The Current File referenced in the screenshot (/Game/Developers/JRiggs/FacialRecordings/010_51_B16BCDA44970977D6EDA7AA3F276DA43/Calibration/undefined/calibration.json) suggests that this temporary file is being ingested. One potential reason is that the working directory used by Capture Manager is under the Unreal Engine project structure, or that the file is being manually ingested (using drag and drop).

Can you check the Settings in Live Link Hub, looking for the Default Working Directory and Should Clean Working Directory options? If the Default Working Directory is pointing to a location within the Unreal Engine project, that would explain what you are seeing here. If so, this would be an unexpected configuration: as this is temporary data, it does not need to persist beyond the ingest process.
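For reference, the check above boils down to "does the working directory resolve to a location under the project root?". A quick, hedged sketch of that containment test in plain Python (the paths below are hypothetical examples, not taken from your setup, and this is not part of Live Link Hub):

```python
from pathlib import Path

def is_inside(child: str, parent: str) -> bool:
    """Return True if `child` resolves to `parent` or a location beneath it."""
    child_p = Path(child).resolve()
    parent_p = Path(parent).resolve()
    return parent_p == child_p or parent_p in child_p.parents

# Hypothetical paths for illustration only:
working_dir = "/projects/MyUEProject/Content/CaptureWorkingDir"
project_dir = "/projects/MyUEProject"

if is_inside(working_dir, project_dir):
    print("Working directory is inside the project - the editor will see its temp files")
```

If the working directory is inside the project's Content folder, the editor's directory watcher treats everything written there as importable content, which matches the behaviour you described.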

Thanks,

Mark.

Great.

During ingest, Capture Manager in Live Link Hub uses the working directory to create temporary, intermediate data before pushing UAssets into the connected Unreal Engine project. What you saw was Unreal Engine detecting this temporary content and attempting to import it based on the JSON file extension alone.
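As an aside on why that forced import then errors out: Unreal's DataTable importer (the factory that picked up the file, per your log) expects JSON shaped as an array of row objects, whereas a calibration file is a single object, so parsing fails. A rough illustration of that shape mismatch in plain Python (illustrative only, not Unreal's actual parser, and the file contents shown are hypothetical):

```python
import json

def looks_like_datatable_rows(text: str) -> bool:
    """DataTable-style JSON: a top-level array of row objects."""
    try:
        data = json.loads(text)
    except json.JSONDecodeError:
        return False
    return isinstance(data, list) and all(isinstance(row, dict) for row in data)

# Hypothetical file contents for illustration:
calibration_like = '{"cameras": [], "metadata": {}}'   # single object
table_like = '[{"Name": "Row_0", "Value": 1.0}]'       # array of rows

print(looks_like_datatable_rows(calibration_like))  # False
print(looks_like_datatable_rows(table_like))        # True
```

So even if you pick a type in the import dialog, the parse will fail, which is consistent with the "Failed to parse the JSON data" message.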

While nothing prevents you from configuring the working directory to be within the project itself, we generally find this to be unnecessary as the content is not required once ingest is complete - checking it into Perforce would bloat the project.

It is perhaps worth noting that the ingested assets (particularly Img Media Source for video and depth data) will also reference a Media directory outside of the project for similar reasons.

Mark.

Hi Mark, all MetaHuman plugins, including the depth one, are enabled except for “MetaHumanRuntime”, since it's deprecated. [Image Removed]

Windows 11 Enterprise 23H2. This has not worked previously; this is the only time I have tried it so far, so yes, I'm trying to ingest iPhone depth data for the first time. I cleared both output logs before running the ingest. Live Link Hub's log has only two lines:

LogCaptureManagerPipeline: Display: Data conversion pipeline completed
LogUploader: Display: Connected to the client: D0FB2E534C43FB1F134156B36B4F4789, export IP address: 10.127.220.99:56509

During the re-import in the editor, it pops up the dialog I posted about before; here are the options:

[Image Removed] I have to try to set something (it will error out), but if I just cancel, it cancels the whole ingest.

Here is the relevant section of the editor output log:

LogFactory: FactoryCreateFile: DataTable with ReimportDataTableFactory (0 1 D:/Project_Dev/REDACTED/Content/Developers/JRiggs/FacialRecordings/010_51_B16BCDA44970977D6EDA7AA3F276DA43/Calibration/undefined/calibration.json)
LogSlate: Window 'DataTable Options' being destroyed
LogCSVImportFactory: Imported DataTable 'calibration' - 1 Problems
LogCSVImportFactory: 0:Failed to parse the JSON data. Error: 
LogSlate: Window 'Message' being destroyed

I attached the full log (cleared right before clicking import)

I was looking in the plugins and there seem to be a couple of camera calibration plugins that are not enabled; possibly one of those is responsible for providing a calibration data asset type?

OK, that explains it: the working directory is in the project directory. I was using the Live Link Face device connected to an iPhone with videos to get the data.