AI lip sync to Metahuman short film

Hello everyone, how’s it going? :slight_smile:
I just created this scene to test the NVIDIA Omniverse Audio2Face to Unreal Engine MetaHuman workflow.
I recorded my voice singing this song, cleaned up the audio a little, and opened it in Audio2Face.
The AI generates lip sync for a 3D face from a voice-over.
The objective here was to show the raw lip sync without any cleanup on the keyframes. However, I manually animated the eyes, brows, and neck in Unreal Engine to add more realism to the scene. I also did the lighting.

Rendered in Unreal Engine 4.27.1 with NVIDIA's new RTX Global Illumination (RTXGI).
Color grading in Adobe Premiere Pro.
Music: Bad Moon Rising by Mourning Ritual (feat. Peter Dreimanis).

I love testing new tools with UE.
Feel free to follow my work:
https://www.artstation.com/andersonrohr
https://www.instagram.com/anderson_rohr

Thanks! :blush: