I have a level populated with NPCs. The Player has to go around the level eavesdropping on conversations. What methods are there to get their lips moving properly?
For what it’s worth, I’m currently using “Mixamo Fuse”-generated characters, if that helps.
Do the characters have a jaw bone? If not, you should go back to your 3D package and add one. I’m not sure exactly how to do it in Unreal, but you’ll need to read the volume of the speech audio files and rotate the jaw bone accordingly.
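The volume-driven approach above boils down to: measure the loudness of a small window of the audio, then map that loudness onto a jaw rotation. Here's a minimal sketch of that mapping in plain Python (not Unreal API code; the function name and the `max_angle_deg` / `full_open_rms` tuning values are invented for illustration):

```python
import math

def jaw_angle_from_samples(samples, max_angle_deg=15.0, full_open_rms=0.3):
    """Map the loudness (RMS) of a window of audio samples (floats in
    [-1, 1]) to a jaw rotation in degrees. max_angle_deg and
    full_open_rms are arbitrary tuning values, not engine constants."""
    if not samples:
        return 0.0
    # Root-mean-square amplitude of the window.
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    # Normalize and clamp so loud peaks don't over-rotate the jaw.
    t = min(rms / full_open_rms, 1.0)
    return t * max_angle_deg

# A quiet window barely opens the jaw; a loud one opens it fully.
quiet = [0.05 * math.sin(i * 0.1) for i in range(512)]
loud = [0.8 * math.sin(i * 0.1) for i in range(512)]
```

In practice you'd call something like this every tick with the most recent chunk of the playing speech file and feed the angle into a jaw-bone transform (in Unreal, typically via a Transform (Modify) Bone node in the Anim Blueprint), ideally with some smoothing between frames so the jaw doesn't jitter.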
This, of course, does not look realistic. If you need realism, you'll have to go back to your 3D package and animate the mouths by hand, or use a plugin like Voice-O-Matic to analyze the speech audio files and blend between mouth shapes automatically, though you'll still need to set up those mouth shapes yourself.
But do they have a Voice-O-Matic plugin? This lip-syncing stuff seems far more work-intensive than anything else. That's it! I'll get everyone to wear helmets!
S’okay. I’ll ask Epic if they’ll give me the money for a 3DS Max sub. They have $5m burning a hole in their pocket, I hear.
Hello guys. I managed to make a lip sync with Papagayo in Blender, but I can't find a way to import the animation data into Unreal.
All it does is import the shape keys I created, not the keyframed data that Papagayo created in Blender. Is there some option I have to choose in the export?