Well, I did try to set up the characterization so that the G3 character matched up with the Epic rig, but from the start the Epic rig is not ideal in terms of what would be considered normalized proportions. The hands and arms are oversized, and the shoulder joints sit higher than they would on your average human figure. Overall it's not the ideal candidate for a 1:1 retarget, even though I have come close with the base pose setup.
The G3 framework, of course, starts with a normalized rig closer to an average 6-foot-tall character, configured with normal humanoid proportions. The pelvis, for example, is parented to the hip joint, which creates a separation of translation and rotation usually only available in more advanced rigging configurations. The one current negative is that the twist bones are parented to the bend joints, so if you plan on using IK it's going to be a bit of a problem; in my opinion that's a bug-type issue that hopefully gets corrected in a future UE4 release. (It can be corrected before adding animations, or as part of retargeting.)
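As a rough illustration of that pre-animation fix, here is a minimal sketch of reparenting the twist bones off the bend joints. This is plain Python over a hypothetical dictionary-based skeleton (the bone names loosely follow G3 naming, but the hierarchy is simplified and the whole structure is illustrative, not any actual Daz Studio or UE4 API):

```python
# Hypothetical child-bone -> parent-bone map; names echo G3 conventions,
# hierarchy simplified for illustration.
skeleton = {
    "lShldrBend":    "lCollar",
    "lShldrTwist":   "lShldrBend",    # twist parented to the bend joint (the problem)
    "lForearmBend":  "lShldrBend",
    "lForearmTwist": "lForearmBend",  # same issue on the forearm
}

def reparent_twist_bones(skel):
    """Move every '*Twist' bone up one level so it hangs off the bend
    joint's parent instead of the bend joint itself, which keeps an IK
    solve on the bend chain from dragging the twist bones around."""
    fixed = dict(skel)
    for bone, parent in skel.items():
        if bone.endswith("Twist") and parent.endswith("Bend"):
            fixed[bone] = skel[parent]  # grandparent becomes the new parent
    return fixed

fixed = reparent_twist_bones(skeleton)
```

The same walk-and-reparent idea applies whether you do it in Daz Studio before export or as a step in the retargeting pipeline.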
That said, no matter what is being retargeted, one can only expect to get close; in general a retarget needs to be cleaned up, just like any other kind of mocap conversion. With MotionBuilder it's a much easier task, since MB's tools are designed to make that cleanup work easier. Long way of saying that anything involving animation data is why the world needs animators in the first place.
A trick I do use, though, for game animation data is to retarget for contact points rather than full-body animations: import the 6 ft 3 in Epic base animations, correct the mannequin's scale to match the 6-foot G3 model, and add auxiliary effectors linked to the contact points of the Epic rig. The result mirrors the required animations, as in the weapons pointing in the right direction, but avoids the hunched shoulders that usually occur with a direct retarget.
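The scale correction in that trick is simple arithmetic: the Epic mannequin is roughly 6 ft 3 in (75 in) and the G3 model is 6 ft (72 in), so the mannequin comes down by a factor of 72/75 = 0.96, and any contact-point positions scale with it. A plain-Python sketch (the heights and the sample grip position are illustrative figures, not values pulled from either rig):

```python
# Approximate heights in inches.
EPIC_HEIGHT_IN = 6 * 12 + 3  # 75 (Epic mannequin, roughly 6 ft 3 in)
G3_HEIGHT_IN   = 6 * 12      # 72 (G3 base figure, 6 ft)

scale = G3_HEIGHT_IN / EPIC_HEIGHT_IN  # 0.96

def scale_contact_point(point, factor):
    """Uniformly scale a contact-point position (x, y, z) so the Epic-rig
    contact points line up with the scaled-down mannequin."""
    return tuple(c * factor for c in point)

# e.g. a hypothetical weapon-grip position on the Epic rig, in rig units
grip = scale_contact_point((0.0, 50.0, 120.0), scale)
```

In practice the auxiliary effectors would then be constrained to these scaled contact points in MotionBuilder, so the G3 character hits the same grips and foot plants without inheriting the mannequin's shoulder posture.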
Lip syncing is a subject unto itself, and sure, like most things in Daz Studio, in-app lip syncing is possible, but like most such solutions it's not what I would consider AAA quality. Using cluster shaping expressions, one can output expressions just like any other animation data, and of course there are some tools in UE4 that help make lip sync implementation easier, but if you are looking for the best-of-the-best solution, it's a process that still requires authoring.
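To make "output expressions just like any other animation data" concrete, here is a toy sketch of baking a viseme track into per-morph keyframe curves. Everything here is hypothetical: the viseme names, morph names, and weights are made up for illustration and don't come from Daz Studio or UE4; the point is only that expression clusters reduce to the same (time, value) curve data as any other baked animation:

```python
# Hypothetical viseme -> morph-weight mapping (names and weights illustrative).
VISEME_TO_MORPHS = {
    "AA": {"MouthOpen": 0.8, "LipsPart": 0.4},
    "OO": {"MouthOpen": 0.4, "LipsPucker": 0.9},
    "MM": {"LipsPressed": 1.0},
}

def bake_viseme_track(timed_visemes):
    """Turn (time_seconds, viseme) pairs into per-morph keyframe lists,
    the same shape as any other baked animation curve."""
    curves = {}
    for t, viseme in timed_visemes:
        for morph, weight in VISEME_TO_MORPHS.get(viseme, {}).items():
            curves.setdefault(morph, []).append((t, weight))
    return curves

# A short made-up phrase: closed lips, then an open vowel, then a rounded one.
curves = bake_viseme_track([(0.0, "MM"), (0.12, "AA"), (0.3, "OO")])
```

Authoring for real quality then means hand-adjusting those curves (timing, co-articulation, overshoot) rather than taking the automatic bake as-is, which is the gap between the one-click tools and a AAA result.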
An example of me messing around with expression clusters exported from DS.
https://www.youtube.com/watch?v=a1RFKxL2Bzg