Animate MetaHumans & WIN! - 2021 iClone Lip Sync Animation Contest

Reallusion, the developers of iClone and Character Creator, are hosting the 2021 iClone Lip Sync Animation Contest from July 1st to September 30th. We are inviting all Unreal animators to enter a 30-second lip-sync animation made with the latest iClone facial animation tools and their favorite MetaHumans.

Unreal animators can submit up to 3 entries for a chance to win over $10,000 USD in cash and over $30,000 in prizes, including UE Marketplace credits sponsored by Epic Games!

Participants can request up to 2 months of free iClone software, including all the Unreal Live Link plugins to get started. :raised_hands:

Visit the contest page:

Reallusion invites creators from around the world to enter the 2021 #iClone Lip Sync Animation Contest for a chance to win $40,000 in cash and prizes!

Competitors can submit a 30-second facial animation video made with iClone 7.9, recreating their favorite movie scenes, music videos, short stories, monologues or just about anything they like! New animation advancements in iClone face and lip-sync solutions work with #CharacterCreator or #MetaHuman characters, allowing entrants to utilize digital humans from either platform. #reallusion

Special thanks to our sponsors! :muscle:
NVIDIA, Epic Games, Pixologic, Marvelous Designer, Boris FX, FXhome, Noitom, Rokoko, Razer

Animating MetaHumans has never been easier, as users can now take advantage of iClone’s updated LIVE LINK plugin for Unreal Engine to bring their characters to life without breaking the immersion.

Unreal animators can now easily make MetaHumans talk, emote, and perform as they wish. The iClone MetaHuman Live Link provides designers with a highly efficient way to animate MetaHumans (face, body, and even voice lip-sync), all in one application.

As an existing iClone user, I have just been testing this workflow using MetaHuman Live Link, AccuLips, mocap, and manual adjustments. It does work well and is quite a time saver. Here is some WIP testing with a MetaHuman; I will try to refine it and push it a bit further.
https://www.youtube.com/watch?v=lFu0RwmoIsA


In this tutorial, you will be introduced to a very brief summary of some of #iClone’s most powerful motion tools, and get a glance at how you can use them to animate your #MetaHuman characters in #UnrealEngine. We’ll also go through the initial setup steps in your Unreal project to enable animation with iClone #LiveLink.
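
If you prefer to script that initial project setup instead of clicking through the editor’s Plugins window, here is a minimal sketch of the equivalent step, assuming a project file named MyMetaHumanProject.uproject; the plugin list is a placeholder, so swap in the plugin names that ship with the iClone Unreal Live Link kit you installed.

```python
# Hedged sketch: pre-enable Live Link plugins by editing the project's .uproject
# file (same result as ticking them in the editor's Plugins window).
# The project file name and plugin list below are assumptions for illustration.
import json
from pathlib import Path

project_file = Path("MyMetaHumanProject.uproject")  # assumed project name
plugins_to_enable = ["LiveLink"]                     # add the iClone Live Link plugin name here

data = json.loads(project_file.read_text())
already_listed = {p.get("Name") for p in data.setdefault("Plugins", [])}
for name in plugins_to_enable:
    if name not in already_listed:
        data["Plugins"].append({"Name": name, "Enabled": True})

project_file.write_text(json.dumps(data, indent="\t"))
print("Plugins enabled:", ", ".join(plugins_to_enable))
```

This only pre-enables the plugins; the rest of the setup shown in the video (adding the Live Link source and assigning subjects) still happens inside the editor.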

The team at Formation Animation shares their behind-the-scenes look at how they easily animated their MetaHuman contest entry using iClone and Character Creator!

Download the new iClone MetaHuman LIVE LINK kit for free!

Continuing with our support for MetaHumans, in this tutorial, we’ll go over the basics of how to animate your MetaHuman character’s face in Unreal using iClone motion tools with the iClone Live Link plug-in.

Inside you will also find a few tips on how to optimize the connection between iClone and Unreal for smoother operation. :+1:
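
The video covers the actual optimization tips; purely as a hedged illustration of the kind of editor-side tweak this usually means, here is an Unreal editor Python sketch that caps the editor frame rate and lowers the viewport screen percentage while streaming. The specific console variables are my own assumptions, not the tutorial’s list.

```python
# Hedged sketch (Unreal editor Python): keep the viewport light while streaming
# from iClone so more headroom is left for the incoming Live Link data.
# Requires the Python Editor Script Plugin and Editor Scripting Utilities.
# The console variables chosen here are assumptions, not the tutorial's list.
import unreal

world = unreal.EditorLevelLibrary.get_editor_world()

# Cap the editor frame rate and lower the rendering resolution percentage.
for command in ("t.MaxFPS 60", "r.ScreenPercentage 50"):
    unreal.SystemLibrary.execute_console_command(world, command)

unreal.log("Applied Live Link viewport tweaks")
```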

In this video, you’ll learn about the basic steps you need to take in order to animate the body of your MetaHuman characters in Unreal using iClone Live Link. We start off with the naming conventions, and then move on to the custom modifications you need to make to your character’s Blueprint in order to get the animation working correctly.

Finally, you’ll learn about a couple of the basic #iClone tools for body animation.
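
As a toy illustration of why those naming conventions matter, here is a plain-Python sketch that checks a source-to-target bone map against the bones the target rig expects. Every name in it is an assumption for illustration only; the real conventions are the ones shown in the video.

```python
# Hedged sketch, plain Python: verify that a bone-name retarget map covers the
# target skeleton. All bone names below are illustrative assumptions.
REQUIRED_TARGET_BONES = ["pelvis", "spine_01", "spine_02", "head"]  # assumed subset

# Assumed iClone/Character Creator -> MetaHuman bone names, for illustration only.
RETARGET_MAP = {
    "CC_Base_Pelvis": "pelvis",
    "CC_Base_Spine01": "spine_01",
    "CC_Base_Spine02": "spine_02",
    "CC_Base_Head": "head",
}

def missing_target_bones(retarget_map, required_bones):
    """Return target bones that no source bone maps onto."""
    covered = set(retarget_map.values())
    return [bone for bone in required_bones if bone not in covered]

if __name__ == "__main__":
    missing = missing_target_bones(RETARGET_MAP, REQUIRED_TARGET_BONES)
    print("Missing target bones:", missing or "none")
```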

Hello there,
Is there a tutorial on how to import FBX (face and body animation) from iClone onto a MetaHuman character?

Hello John,

At the moment, there is no tutorial because iClone characters are not fully compatible with MetaHuman meshes. That is why we use this trick of puppeteering MetaHumans with an iClone dummy.
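
Conceptually, the dummy trick boils down to copying transforms by name each frame instead of importing an FBX onto the MetaHuman mesh directly. A minimal sketch, with assumed bone names and an assumed frame format:

```python
# Hedged sketch, plain Python: per frame, copy each dummy bone's transform onto
# the MetaHuman bone it maps to. Bone names and frame format are assumptions.
DUMMY_TO_METAHUMAN = {"CC_Base_Head": "head", "CC_Base_Pelvis": "pelvis"}  # assumed

def retarget_frame(dummy_frame):
    """dummy_frame: {bone_name: transform} streamed from the iClone dummy."""
    return {
        DUMMY_TO_METAHUMAN[bone]: transform
        for bone, transform in dummy_frame.items()
        if bone in DUMMY_TO_METAHUMAN
    }

# Example: one incoming frame of (location, rotation, scale) tuples.
frame = {"CC_Base_Head": ((0, 0, 150), (0, 5, 0), (1, 1, 1))}
print(retarget_frame(frame))  # {'head': ((0, 0, 150), (0, 5, 0), (1, 1, 1))}
```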

In this #iClone #MetaHuman tutorial, we will cover how you can stream both body and facial animation simultaneously over Live Link into #UnrealEngine for full body motion performance. :man_teacher:

You’ll also learn about how you can record the animation on your MetaHuman character in Unreal using the Take Recorder, as well as how to fix an issue that currently exists when animating MetaHuman characters with Live Link.
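
For context on why recording is treated as its own step, here is a conceptual plain-Python sketch of what capturing a live-streamed performance amounts to: timestamp the incoming frames, then resample the take to a fixed frame rate afterwards so uneven arrival times do not become uneven playback speed. This is an illustration only, not how the Unreal Take Recorder is implemented.

```python
# Hedged conceptual sketch: resample timestamped frames to a fixed frame rate.
from bisect import bisect_right

def resample(timed_frames, fps=30.0):
    """timed_frames: list of (timestamp_seconds, frame_data), sorted by time."""
    if not timed_frames:
        return []
    times = [t for t, _ in timed_frames]
    start, end = times[0], times[-1]
    out, t = [], start
    while t <= end:
        i = bisect_right(times, t) - 1   # hold the most recently received frame
        out.append(timed_frames[i][1])
        t += 1.0 / fps
    return out

take = [(0.00, "pose_a"), (0.03, "pose_b"), (0.09, "pose_c")]
print(resample(take, fps=30.0))  # ['pose_a', 'pose_b', 'pose_b']
```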

Thank you, I already went through the tutorial, and everything works great until I try to record. Even after optimizing both programs’ performance, the recording is not clean. I guess I will have to stick to an iClone character for the animation if there are no other solutions. One last thing: is there a way to have motion blur in iClone? Hair in Unreal looks quite bad with dithering, and motion blur is the only thing I miss in iClone…
For those with a similar problem, you can add motion blur in post with DaVinci Resolve: Adding Motion Blur in the FREE Version of Davinci Resolve | NO 3RD PARTY PLUGINS!!! - YouTube

Hello John,

Would you explain more about “the recording is not clean”? Kindly provide us with a video recording of the issue and email it to marketing@reallusion.com with the title “User Feedback - Unreal Engine Forum”. Once we are able to identify the problem, our team will get back to you with a reply. Thanks!

Hello,
The recorded sequence (in Unreal) has missing frames and inconsistent speed. I will follow up with your marketing team.
Thank you!

Hello guys…
Thanks to Reallusion and Epic Games for this, a really good thing for noobs like me. It’s a huge opportunity, and I have already started learning iClone 7 and CC3; I downloaded the trial version a few days ago…

@JohnDoe0 - Thank you so much for sharing your feedback, John. We are working hard to make the iClone Unreal Pipeline more powerful and easier for everyone! :raised_hands:

@Borisle3eme - We really appreciate your trust and support, Borisle3eme! Every month we are cranking out more tutorials while innovating on more features for all iClone-Unreal Engine users. Follow us here, as we will regularly post more easy-to-follow tutorials. :+1:

Longtime #iClone user Solomon Jagwe shares his behind-the-scenes video and tutorial on animating his #MetaHuman “I Have a Dream” speech with iClone AccuLips and Unreal LIVE LINK.

In this tutorial, Solomon shows how he created a custom MetaHuman, which he used to narrate part of the “I Have a Dream” speech by Dr. Martin Luther King Jr. The MetaHuman facial animation is driven by an iClone character using AccuLips + Live Face with an iPhone X, streamed via the iClone Unreal Live Link to Unreal Engine.

Solomon continues with his iClone-MetaHuman tutorials. In this video, he walks us through how to add full-body animations to MetaHuman children, created with MetaHuman Creator for Unreal Engine, using #iClone Live Link and ActorCore mocap.

“This is such a time saver and makes it possible to animate multiple MetaHumans at the same time.” - Solomon Jagwe

MetaHuman Facial Animation - 1-Minute Tips to skillfully animate MetaHumans

MetaHuman Animation - Tip #1 - Creating Accurate and Smooth Lip-Syncs with iClone AccuLips :smiley: