AR facial animation to full body?

Has anyone done this? I can get the Face AR Sample project working, but only as a floating head. I don’t even know how to start applying that to a full body. Do I have to make a headless body and attach the two together?

Thanks

This is Perception Neuron real-time animation + iPhone X, driving the full character in real time inside UE4; the facial rig is blend shape based, of course.

Use a Layered Blend Per Bone node to mix the two sets of animation data.

Also, if you don’t get a reply on a previous post, do not open another one the next day.

In your example, do you have a separate mesh and skeleton for the face and the body? If not, then how exactly do you get the blend shapes from the face? I would assume you’d make the blend shapes on the same mesh, since you’d want to keep the same bone structure?

A single body, but you can simply have them as separate skeletal meshes using the same skeleton, so that they can share the same AnimBP.
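Not from the sample itself, just a rough C++ sketch of that setup, assuming a body mesh and a face mesh that are both skinned to the same skeleton; the class, component, and file names are placeholders:

```cpp
// Minimal sketch: a character with separate body and face skeletal meshes
// that share one skeleton, so one AnimBP can drive both.
// All names here are placeholders, not taken from the Face AR Sample.

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"
#include "ARBodyCharacter.generated.h"

UCLASS()
class AARBodyCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    AARBodyCharacter()
    {
        // Second skeletal mesh component for the face, attached to the
        // character's main (body) mesh.
        FaceMesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("FaceMesh"));
        FaceMesh->SetupAttachment(GetMesh());
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Option A: let the face copy the body's pose (master pose), so only
        // the body component runs the AnimBP.
        FaceMesh->SetMasterPoseComponent(GetMesh());

        // Option B (instead of A): run the same AnimBP class on the face
        // component, which works because both meshes use the same skeleton.
        // FaceMesh->SetAnimInstanceClass(GetMesh()->GetAnimInstance()->GetClass());
    }

    UPROPERTY(VisibleAnywhere, Category = "Mesh")
    USkeletalMeshComponent* FaceMesh;
};
```

In Blueprint the equivalent would be adding a second Skeletal Mesh component to the character and either setting its Master Pose Component or assigning the same Anim Class.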

Ok thanks, you’ve been very helpful so far :slight_smile: Also, sorry for my English (not native).

I’m still struggling to understand how to get the iPhone facial data onto a full body mesh, though. I’m experimenting with the full body mesh from “A Boy and His Kite”: I just imported it into the Face AR Sample, and I can’t get it to work, since apparently the skeletons are different.

The full body mesh comes with all these blend shapes already integrated, but somehow when I use retargeting everything breaks: the blend shapes aren’t used, only the joint data.

When I open the pose asset from the head mesh and play with the sliders, I can see that the blend shapes and the joint data are combined, but I can’t find a way to replicate this. I tried creating a new pose asset, but could only get the joint animation to work. Is there something I’m missing here?

I used retargeting from the A Boy and His Kite AnimBP to my UE4-skeleton-based characters, and everything worked without issues.

Alternatively, you can create one float for each blend shape in the A Boy and His Kite AnimBP, read those values (which are basically 0-to-1 values), and apply them to your character. That way you’re just “borrowing” the data from that character and applying the facial animation onto your own.
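As a rough sketch of that “borrowing” idea (not from the sample; the function, parameter, and curve names below are placeholders, and the curve list depends on your rig):

```cpp
// Read the 0-1 blend shape curves from the Face AR head's AnimInstance each
// frame and push them onto your own character's morph targets.
// SourceFaceMesh / TargetBodyMesh / CurveNames are placeholders.

#include "CoreMinimal.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimInstance.h"

void CopyFaceCurves(USkeletalMeshComponent* SourceFaceMesh,
                    USkeletalMeshComponent* TargetBodyMesh,
                    const TArray<FName>& CurveNames)
{
    UAnimInstance* SourceAnim = SourceFaceMesh ? SourceFaceMesh->GetAnimInstance() : nullptr;
    if (!SourceAnim || !TargetBodyMesh)
    {
        return;
    }

    for (const FName& CurveName : CurveNames)
    {
        // ARKit-style curves are roughly 0..1 (e.g. "jawOpen", "eyeBlinkLeft",
        // depending on how your rig names them).
        const float Value = SourceAnim->GetCurveValue(CurveName);

        // Assumes the target mesh has morph targets with matching names.
        TargetBodyMesh->SetMorphTarget(CurveName, Value);
    }
}
```

You’d call something like this every frame (from Tick, or from your character’s AnimBP update), assuming your body mesh has morph targets named the same as the source curves.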

I gave a detailed explanation in this post.

I see.
Quick question: when you mention A Boy and His Kite, are you referring to the full body mesh or to the head mesh available in the Face AR Sample?

The head with the blend shapes, i.e. the one in the Face AR Sample; all the data and the retargeting come from that mesh.