Dynamic blending of animation assets at runtime...

I have a crowd of skeletal mesh components created within a Blueprint at runtime.

They all share the same skeleton and animation assets despite having different meshes.

I can assign a random dance animation with a random start offset for a great degree of variety. This is all working fine.
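For reference, here is a rough C++ sketch of what my spawning Blueprint does per crowd member (the class and variable names such as `ACrowdActor` and `DanceAnims` are just placeholders for illustration, not my actual setup):

```cpp
// Rough C++ equivalent of the per-member Blueprint logic (placeholder names).
#include "GameFramework/Actor.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimSequence.h"
#include "Engine/SkeletalMesh.h"
#include "CrowdActor.generated.h"

UCLASS()
class ACrowdActor : public AActor
{
    GENERATED_BODY()
public:
    // Pool of dance animations to pick from (filled in the editor).
    UPROPERTY(EditAnywhere, Category = "Crowd")
    TArray<UAnimSequence*> DanceAnims;

    void SpawnCrowdMember(USkeletalMesh* Mesh)
    {
        if (DanceAnims.Num() == 0)
        {
            return;
        }

        // Create a skeletal mesh component at runtime
        // (same as "Add Skeletal Mesh Component" in the Blueprint).
        USkeletalMeshComponent* Comp = NewObject<USkeletalMeshComponent>(this);
        Comp->SetSkeletalMesh(Mesh);
        Comp->RegisterComponent();
        Comp->AttachToComponent(GetRootComponent(),
                                FAttachmentTransformRules::KeepRelativeTransform);

        // Pick a random dance and start it at a random time offset for variety.
        UAnimSequence* Dance = DanceAnims[FMath::RandRange(0, DanceAnims.Num() - 1)];
        Comp->PlayAnimation(Dance, /*bLooping=*/true);
        Comp->SetPosition(FMath::FRandRange(0.f, Dance->GetPlayLength()));
    }
};
```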

However, what I would really love to be able to do is to blend between a random idle animation (from another array set up in the Blueprint) and a random dance animation.

So it's similar to a 1D Blendspace, but instead of hard-coding the animation assets within the Blendspace, I want to be able to blend between a random asset from each of the two states (idle > dance).

Using an Animation Blueprint, I can blend between two Sequence Player pose inputs in the AnimGraph via an exposed float variable. This again works great, but I am still missing the ability to pass in a random idle and a random dance animation asset.
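To illustrate what I mean, if the AnimBP's variables were declared in a C++ AnimInstance it would look roughly like this. `UCrowdAnimInstance`, `IdleAnim` and `DanceAnim` are placeholder names I've made up for this sketch; `BlendAlpha` corresponds to the exposed float that already drives the blend, and the two sequence variables are the per-instance assets I can't currently feed into the Sequence Player nodes:

```cpp
// Sketch of the AnimInstance-side variables (placeholder names, for illustration only).
#include "Animation/AnimInstance.h"
#include "Animation/AnimSequence.h"
#include "CrowdAnimInstance.generated.h"

UCLASS()
class UCrowdAnimInstance : public UAnimInstance
{
    GENERATED_BODY()
public:
    // 0 = idle pose input, 1 = dance pose input (drives the blend node in the AnimGraph).
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Crowd")
    float BlendAlpha = 0.f;

    // The per-component random picks I would like the two Sequence Player nodes to use.
    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Crowd")
    UAnimSequence* IdleAnim = nullptr;

    UPROPERTY(EditAnywhere, BlueprintReadWrite, Category = "Crowd")
    UAnimSequence* DanceAnim = nullptr;
};
```

And from the spawning side, inside `SpawnCrowdMember` from the earlier sketch, this is roughly how I'd want to feed each component (assuming the component's Anim Class is set to the Animation Blueprint, and `IdleAnims` is a second array alongside `DanceAnims`):

```cpp
// Per-member setup, assuming the component runs the AnimBP rather than single-node playback.
if (UCrowdAnimInstance* Anim = Cast<UCrowdAnimInstance>(Comp->GetAnimInstance()))
{
    Anim->IdleAnim   = IdleAnims[FMath::RandRange(0, IdleAnims.Num() - 1)];
    Anim->DanceAnim  = DanceAnims[FMath::RandRange(0, DanceAnims.Num() - 1)];
    Anim->BlendAlpha = 0.f; // start fully in idle, push towards 1 to blend into the dance
}
```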

I could create a bunch of Blendspaces or Montages, but there would be so many combinations that I'd rather not.

I did investigate (with the help of ChatGPT) using Animation Layer Interfaces and Animation Layers inside an Animation Blueprint, but frankly the AI's suggestions did not align with reality and I gave up.

Is this even possible? If so, could anyone point me in the direction of how to implement this?

Thanks in advance.