Opinions on the Oculus Lipsync Plugin

I am testing the OVRLipsync plugin from Oculus and was wondering about some opinions on it. I have linked a test video, using only visemes without any expression. Would this be acceptable quality to use in a game? It seems to be of similar quality to the older Bethesda RPG lip-sync methods. The main bonus is that it is procedural, so it could be batch processed across unlimited .wav files and blended with other expressions.

Well, since you can plug it into UE4, it’s a good start. Using Genesis 8, though, is even better, as it supports both morph and cluster shaping; and while there are other interesting options, Genesis has the resources for at least testing such options without having to invest a lot of time.

I’ve been playing with Genesis 3 and MotionBuilder as a procedural pipeline, with good results.

As for game use, it’s not there yet, but one thing I found using Genesis is that to get a good result you need to over-extend the shape as a percentage to get the correct definition. So more important than the quality of the final output is the ability to edit the output of the individual shapes.
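One way to read the "over-extend as a percentage" advice is as a gain applied to each viseme weight before it drives the morph target. A minimal sketch, assuming weights in the 0–1 range; the gain value and viseme names are illustrative, not from the plugin:

```python
def over_extend(weights, gain=1.5, cap=1.0):
    """Amplify raw viseme weights so the shapes read clearly on the
    face, clamping so nothing overshoots the morph target range."""
    return {name: min(w * gain, cap) for name, w in weights.items()}
```

The interesting knob is per-shape rather than global: in practice each viseme may need its own gain, which is exactly the "edit the output of the individual shapes" point above.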

I do suspect, though, that Epic is also working on such a feature, given the introduction of the Control Rig feature.

Is this an Epic-supported plugin?

Another example

I lowered the Genesis 8 maximum viseme percentages (e.g. a 50% cap on the “AA” slider) before export and tried different viseme percentage combinations, since the Oculus library is pre-built and uneditable. I suppose the plugin’s UE4 wrapper code could be edited instead. I also edited Genesis 8 by hiding all bones except the neck, renaming it to neck_01, and then merging it with the default UE4 skeleton in Maya, so the character ends up on the standard UE4 skeleton.
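Since the pre-built library can’t be changed, the same capping could instead live in the wrapper, scaling each viseme weight before it reaches the morph targets. A sketch, using the 15-viseme set OVRLipsync documents (the cap values themselves are just examples):

```python
# The 15 visemes OVRLipsync reports, per the Oculus documentation.
OVR_VISEMES = ["sil", "PP", "FF", "TH", "DD", "kk", "CH", "SS",
               "nn", "RR", "aa", "E", "ih", "oh", "ou"]

def apply_caps(frame, caps):
    """Scale each viseme weight in one frame by a per-viseme cap,
    mimicking lowered slider maximums (e.g. {"aa": 0.5})."""
    return [w * caps.get(name, 1.0) for name, w in zip(OVR_VISEMES, frame)]
```

Doing it wrapper-side means the Daz export can keep full-range morphs, and different characters can use different cap tables without re-exporting.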

Daz Studio is far more robust than I expected, although quaternion skinning is not supported in stock UE4; I don’t know how important that is. But Daz with the Oculus plugin is the only pre-built, fully procedural method I have found. I cannot imagine how to handle 100+ NPCs otherwise, short of building something from the ground up.

Well, Oculus is an interesting possibility, but it’s a case of Unreal 4 and Epic playing catch-up with what has been available in the DCC pipeline for years now. The fact that it can be done in UE4 is interesting as a hint at what Epic has in mind for key feature additions.

The thing is, getting UE4 to do something is not really the hard part; standardizing the base framework is, and that is where Genesis comes in. Granted, it has its issues (what off-the-shelf solution doesn’t?), but from a top-down development pathway you have to start with a base that is the same under any condition and then introduce the ability to change the output to the desired result.

This is another reason the addition of the Control Rig feature is so interesting: I’m assuming a much more usable feature for, say, procedurally driven facial animation will one day show up.

So Genesis 8 is a good starting point as far as base requirements go: as you have discovered, it is robust enough to get the kind of result you are looking for, and UE4 is now flexible enough to make use of more advanced applications even when they are external to UE4.

An example

The animation was done in MMD, exported to Daz Studio, and then retargeted to Genesis 3, all back in 2014. The fun thing is that kids are doing this kind of stuff because it’s free and fun :smiley: