MetaHuman for Maya: MetaHuman Groom Exporter - No Way to Export an Animated Groom Cache?

Hello, I’ve been using the MetaHuman Groom Exporter in Maya 2025 with no real issues after figuring out how to get it working. However, it seems to be impossible to export an animated groom cache after simulating your groom, which is not ideal for cinematic rendering. I have tried swapping the original xgGuides for animated ones based on the simulation, but there is no way to export these xgGuides from XGen, let alone in a way that UE will read as a groom asset. I have also tried exporting the simulated curves as an Alembic file, but these do not import as a groom asset in UE and therefore cannot be used as guides to drive the groom’s animation. I have even tried editing the .py files that come with the groom exporter so that it reads the entire frame range I need, but it still exports a single frame of the groom, even though it steps through all the other frames.
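For reference, the Alembic export of the simulated curves was a plain AbcExport job along these lines (the node selection, frame range, and output path are placeholders for my actual scene):

```python
# Rough sketch of the AbcExport job used to bake the simulated guide
# curves over the whole frame range (paths and range are placeholders).
import maya.cmds as cmds

cmds.loadPlugin("AbcExport", quiet=True)

start, end = 1, 100                          # simulation frame range
roots = cmds.ls(selection=True, long=True)   # the simulated guide curves

job = "-frameRange {0} {1} -worldSpace -uvWrite -dataFormat ogawa {2} -file {3}".format(
    start,
    end,
    " ".join("-root " + r for r in roots),
    "D:/caches/groom_guides_sim.abc",
)
cmds.AbcExport(j=job)
```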

The closest I got was converting the groom to XGen Interactive, using the .abc file of the simulated curves to drive a Linear Wire modifier, and then exporting those wires as an Alembic file. Sadly, this still does not work, as Unreal reads the *guides* as strands and generates its own guides for them. The strand count and vertex count therefore match the static groom’s guide count and vertex count, but because the curves are read as strands instead of as guides, the animated groom cache is reported as “incompatible with the static groom.” I will attach a couple of images of this import failure in case my explanation was not clear.

(image attachment: groom01)

(image attachment: zzCvMf7)
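For what it’s worth, here is roughly how I checked that the exported wire curves match the static groom’s guides in curve count and per-curve CV count before exporting (the group names are placeholders for my scene):

```python
# Compare curve count and total CV count between the static groom's
# guides and the wire-deformed curves (group names are placeholders).
import maya.cmds as cmds

def curve_counts(group):
    """Return (number of NURBS curves, total CV count) under a group."""
    shapes = cmds.listRelatives(group, allDescendents=True,
                                type="nurbsCurve", fullPath=True) or []
    total_cvs = 0
    for s in shapes:
        spans = cmds.getAttr(s + ".spans")
        degree = cmds.getAttr(s + ".degree")
        total_cvs += spans + degree  # CV count of an open NURBS curve
    return len(shapes), total_cvs

print(curve_counts("static_groom_guides_grp"))
print(curve_counts("simulated_wire_curves_grp"))
```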

The only way I have been able to get Unreal to read an animated cache at all is by exporting a cache of the animated groom description in its interactive state. This is not a fix: it is still not compatible with the groom created by the MetaHuman Groom Exporter; it is simply its own groom with additional animation data. It also results in files upwards of 15 GB for 100 frames on any groom that is dense or complex, since every strand is cached on every frame. That is already not really sustainable on its own, and the file size makes it especially difficult to work with in Unreal.

Is there something I am missing, or is this truly impossible?