I did an experiment yesterday to see if I could essentially “fake” a Live Link calibration take for MetaHuman Animator. With the consent of the real person, I generated images with AI from a large set of reference photos: a forward-looking frame, a left-looking frame, a right-looking frame, and a forward “teeth-baring” frame. I brought these into Premiere and rendered out a QuickTime ProRes 422 video that held each shot for a few seconds. My hope was to use this as a replacement for a “real” Live Link calibration take and trick the system into using the images (which are very lifelike).

I could get the footage to play as a capture source, and in the MetaHuman Identity I could even select it. However, the video wouldn’t appear in the timeline, so I couldn’t create markers or promote frames.

I wonder if anyone has done anything like this successfully. I’ve tried other methods, such as the ChatAvatar approach for creating a mesh, as well as the Keen Blender add-on. Both worked okay (for the Mesh to MetaHuman workflow), but not as well as I had hoped. Is there a way to actually do what I tried to do in the first place? Is there something in the Live Link take payload that has to be present in the video file as metadata? Many thanks if you happen to know!
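In case it helps anyone debugging the same thing: one way to narrow down whether metadata is the blocker is to diff what a known-good Live Link take carries against the Premiere render, e.g. with ffprobe (part of FFmpeg). This is only a diagnostic sketch, not a confirmed fix; the file paths are placeholders, and I'm assuming the relevant difference would show up in the container-level tags.

```python
# Sketch: compare container-level metadata of a real Live Link take
# against a Premiere-rendered ProRes file using ffprobe's JSON output.
# File paths below are placeholders, not real take names.
import json
import subprocess

def probe(path: str) -> dict:
    """Return ffprobe's format/stream metadata for a movie file as a dict."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)

def missing_tags(reference: dict, candidate: dict) -> set:
    """Container tags present in the reference take but absent from the candidate."""
    ref_tags = reference.get("format", {}).get("tags", {})
    cand_tags = candidate.get("format", {}).get("tags", {})
    return set(ref_tags) - set(cand_tags)

if __name__ == "__main__":
    real = probe("real_livelink_take.mov")       # placeholder path
    fake = probe("premiere_prores_render.mov")   # placeholder path
    print("Tags the fake render is missing:", missing_tags(real, fake))
```

If the real take turns out to carry timecode or device tags the render lacks, that would at least point at what the Identity timeline is checking for.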