Oculus OVR Lipsync Plugin

Has anyone used the free Oculus plugin for lipsync in UE4? I came across this and it seems to be a complete solution for lipsync, provided the proper morph targets are set up on the mesh. It uses 15 visemes and runs fully automatically, either on the fly or from pre-recorded WAVs. Looks similar to Fallout 3/4’s lipsync…
I think it would be fairly easy to map your own custom visemes instead of using the built-in ones (rough idea sketched after the link).

https://developer.oculus.com/documen…ipsync-unreal/
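
Something like this is what I had in mind. This isn’t the plugin’s official API, just a sketch: it assumes you can read the 15 viseme weights from the lipsync component each tick (in the standard Oculus order: sil, PP, FF, TH, DD, kk, CH, SS, nn, RR, aa, E, ih, oh, ou) and that your mesh exposes morph targets under whatever names you chose.

```cpp
#include "Components/SkeletalMeshComponent.h"

// Call each tick with the viseme weights coming from the lipsync component.
// The morph target names here are placeholders for your own custom set.
void ApplyVisemesToMesh(USkeletalMeshComponent* Mesh, const TArray<float>& VisemeWeights)
{
    // Custom morph target names, in the same order as the 15 Oculus visemes.
    static const FName CustomVisemeMorphs[15] = {
        FName(TEXT("V_Silence")), FName(TEXT("V_PP")), FName(TEXT("V_FF")),
        FName(TEXT("V_TH")),      FName(TEXT("V_DD")), FName(TEXT("V_KK")),
        FName(TEXT("V_CH")),      FName(TEXT("V_SS")), FName(TEXT("V_NN")),
        FName(TEXT("V_RR")),      FName(TEXT("V_AA")), FName(TEXT("V_E")),
        FName(TEXT("V_IH")),      FName(TEXT("V_OH")), FName(TEXT("V_OU"))
    };

    if (!Mesh || VisemeWeights.Num() < 15)
    {
        return;
    }

    for (int32 i = 0; i < 15; ++i)
    {
        // SetMorphTarget drives the named morph target directly on the mesh,
        // which is handy for quick tests outside of the anim graph.
        Mesh->SetMorphTarget(CustomVisemeMorphs[i], VisemeWeights[i]);
    }
}
```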

I used it briefly and honestly wasn’t satisfied at all. I’m just using the FaceARSample with the iPhone X to get (limited) lip movement while talking, but the expressions captured along with the talking make the whole thing much more realistic.

I tried an older Faceshift build, but that workflow would be much more time-consuming. I thought maybe procedurally layering some expression on top of the Oculus plugin would be simpler. I was mostly hoping for input on a more procedural approach, similar to the Bethesda RPGs.
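
For the procedural layering, something as simple as this might be enough to start with. It’s only a sketch under my own assumptions: the viseme morphs from the plugin stay as the base layer, and a small, slowly drifting amount of expression is added on top, scaled by how much the character is currently talking. The "Smile" and "BrowUp" morph target names are placeholders for whatever your mesh actually has.

```cpp
#include "Components/SkeletalMeshComponent.h"
#include "Math/UnrealMathUtility.h"

// TalkingIntensity could be, for example, the sum of the non-silence viseme
// weights for this frame, computed by the caller.
void ApplyProceduralExpression(USkeletalMeshComponent* Mesh, float TimeSeconds, float TalkingIntensity)
{
    if (!Mesh)
    {
        return;
    }

    // Slow oscillation so the face is never completely static.
    const float Drift = 0.5f + 0.5f * FMath::Sin(TimeSeconds * 0.7f);

    // Keep the expression subtle and tied to how much the character is talking.
    const float Smile  = FMath::Clamp(0.15f + 0.25f * Drift * TalkingIntensity, 0.0f, 1.0f);
    const float BrowUp = FMath::Clamp(0.30f * Drift * TalkingIntensity, 0.0f, 1.0f);

    // Placeholder morph target names; swap in the ones your mesh exposes.
    Mesh->SetMorphTarget(FName(TEXT("Smile")), Smile);
    Mesh->SetMorphTarget(FName(TEXT("BrowUp")), BrowUp);
}
```

In practice I’d probably replace the sine drift with something noise-based, and blend toward authored expression poses rather than single morphs, but the idea of a base lipsync layer plus a cheap procedural expression layer is the same.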