Best way to create dialogue (talking) animations using only Blender/Unreal?

Interesting. I’ll take a closer look at the G3 setup. By exporting the rigging data, are you talking about exporting a body/armature and using that to copy bone weights/morphs to a new, custom body asset? Or is there a more direct way to do this? I.e., importing my own custom mesh, rigging it inside of Genesis, and exporting the new, rigged mesh with the armature?

I wonder about exporting animations with Genesis. Is that possible as well?

BTW, to clear up another point, the reason I was wanting to stick to the Unreal armature is for possible use with a plugin called Allright Rig. It lets you take any mesh that has been rigged to the Unreal armature and animate it inside the game engine. I’ve never been a fan of Blender’s animation system, unfortunately. Maybe I just need to experiment with it a bit more.

The other question is: what would be the best way to get the animations in-game? The three ways I can think of are as follows:

  1. Create the animations inside Blender or Genesis alongside a sound file, export both to Unreal, and make a dialogue blueprint that plays the sound file together with the animation for each line of dialogue (rough sketch of this one below)?
  2. Create different animations for each mouth movement and make a master blueprint that drives the movement via sound drivers?
  3. Create different animations for mouth movement, then use Matinee to combine them and use them inside of dialogue blueprints?
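Something like this is roughly what I picture option 1 looking like on the C++ side, just to make the idea concrete (FDialogueLine and PlayDialogueLine are placeholder names I made up for the example, not existing engine types):

```cpp
// Rough sketch of option 1: pair each exported voice clip with its matching
// talking animation and play both on the NPC at the same time.
// FDialogueLine and PlayDialogueLine are placeholder names, not engine types.
#include "GameFramework/Character.h"
#include "Kismet/GameplayStatics.h"
#include "Sound/SoundBase.h"
#include "Animation/AnimMontage.h"

struct FDialogueLine
{
    // In a real project these would be UPROPERTYs so the dialogue blueprint can fill them in.
    USoundBase*   VoiceClip = nullptr;     // sound file exported alongside the animation
    UAnimMontage* TalkAnimation = nullptr; // talking animation authored in Blender/Genesis
};

static void PlayDialogueLine(ACharacter* Speaker, const FDialogueLine& Line)
{
    if (!Speaker || !Line.VoiceClip || !Line.TalkAnimation)
    {
        return;
    }

    // Attach the voice audio to the speaker's mesh so it follows the character around.
    UGameplayStatics::SpawnSoundAttached(Line.VoiceClip, Speaker->GetMesh());

    // Play the matching talking animation as a montage on the same skeletal mesh.
    Speaker->PlayAnimMontage(Line.TalkAnimation);
}
```

In a Blueprint-only project the same pairing would presumably just be a Play Sound Attached node plus a Play Anim Montage node fired together from the dialogue blueprint.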

Beyond that, I assume I probably need a master blueprint of sorts to handle emotion states and to allow an NPC to track the player or another NPC with its eyes and head.
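For the head/eye tracking part, what I’m imagining is the character’s anim blueprint (or a custom anim instance like the placeholder UNPCAnimInstance below) exposing a look-at target that Look At nodes in the AnimGraph can read, with the master blueprint deciding who the target actually is based on emotion/conversation state:

```cpp
// Sketch of the head/eye tracking idea: the anim instance keeps a LookAtTarget
// location that Look At nodes in the AnimGraph use to aim the head/eye bones.
// UNPCAnimInstance and LookAtTarget are placeholder names for this example.
#include "Animation/AnimInstance.h"
#include "GameFramework/Pawn.h"
#include "Kismet/GameplayStatics.h"
#include "NPCAnimInstance.generated.h"

UCLASS()
class UNPCAnimInstance : public UAnimInstance
{
    GENERATED_BODY()

public:
    // Read by Look At nodes in the AnimGraph to aim the head and eye bones.
    UPROPERTY(BlueprintReadOnly, Category = "Dialogue")
    FVector LookAtTarget = FVector::ZeroVector;

    virtual void NativeUpdateAnimation(float DeltaSeconds) override
    {
        Super::NativeUpdateAnimation(DeltaSeconds);

        APawn* Owner = TryGetPawnOwner();
        if (!Owner)
        {
            return;
        }

        // For now this just tracks the local player; the "emotion state" master
        // blueprint could swap the target to another NPC or disable tracking.
        if (APawn* Player = UGameplayStatics::GetPlayerPawn(Owner, 0))
        {
            LookAtTarget = Player->GetActorLocation();
        }
    }
};
```

No idea yet if that’s the cleanest way to structure it, so I’d be curious how others have handled the emotion-state side of this.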