Hello, I am learning Unreal MetaHuman. Many videos on YouTube use Live Link Face to capture facial key points and drive MetaHuman expressions. Can we drive MetaHuman facial expressions directly through Python or C++? We want to use AI to drive the MetaHuman's facial expressions.
Sure, here's a basic Python example:
import unreal

def SetLessZombieFace(levelSequence, faceRig, frameNumber):
    # Key four MetaHuman face-board controls at the given frame
    unreal.ControlRigSequencerLibrary.set_local_control_rig_float(levelSequence, faceRig, "CTRL_L_eye_eyelidU", frameNumber, -0.4)
    unreal.ControlRigSequencerLibrary.set_local_control_rig_float(levelSequence, faceRig, "CTRL_R_eye_eyelidU", frameNumber, -0.4)
    unreal.ControlRigSequencerLibrary.set_local_control_rig_float(levelSequence, faceRig, "CTRL_L_eye_faceScrunch", frameNumber, 0.4)
    unreal.ControlRigSequencerLibrary.set_local_control_rig_float(levelSequence, faceRig, "CTRL_R_eye_faceScrunch", frameNumber, 0.4)
where the calling code might look like this:
levelSequence = unreal.LevelSequenceEditorBlueprintLibrary.get_current_level_sequence()

# Find the MetaHuman face control rig among the sequence's control rigs
faceRigs = []
rigProxies = unreal.ControlRigSequencerLibrary.get_control_rigs(levelSequence)
for rigProxy in rigProxies:
    rig = rigProxy.control_rig
    if rig.get_name() == 'Face_ControlBoard_CtrlRig':
        print("Face found")
        faceRigs.append(rig)

frameNumber = unreal.FrameNumber(0)
SetLessZombieFace(levelSequence, faceRigs[0], frameNumber)
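For AI-driven animation you would typically key a value on every frame rather than a single pose. Here is a hedged sketch of that idea using the same `set_local_control_rig_float` call: the blink curve, the helper names (`make_blink_curve`, `key_blink`), and the frame range are my own assumptions for illustration, not part of the MetaHuman or Unreal API.

```python
# Only available when running inside the Unreal Editor's Python environment;
# guarded so the curve helper below can be tried standalone.
try:
    import unreal
except ImportError:
    unreal = None

def make_blink_curve(num_frames, depth=-0.6):
    """Return one eyelid value per frame: 0 -> depth -> 0 (triangle shape).

    In practice these values would come from your AI model instead.
    """
    half = num_frames // 2
    values = []
    for i in range(num_frames):
        t = i / half if i <= half else (num_frames - i) / (num_frames - half)
        values.append(depth * t)
    return values

def key_blink(levelSequence, faceRig, start_frame, num_frames):
    """Key both eyelid controls on every frame of the blink (editor only)."""
    for i, value in enumerate(make_blink_curve(num_frames)):
        frame = unreal.FrameNumber(start_frame + i)
        for control in ("CTRL_L_eye_eyelidU", "CTRL_R_eye_eyelidU"):
            unreal.ControlRigSequencerLibrary.set_local_control_rig_float(
                levelSequence, faceRig, control, frame, value)
```

With the `faceRigs` lookup from above, a call might be `key_blink(levelSequence, faceRigs[0], 0, 12)` to key a half-second blink at 24 fps.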
Thanks!