Jobutsu - ZenBlink For MetaHumans

ZenBlink Plugin for Unreal Engine

Bring MetaHuman characters to life with ZenBlink, a powerful plugin for dynamic eye, head, and face animation with emotional realism.

Features:

  • Auto Blink

  • Pupil Constriction

  • Eye Animation

  • Basic Emotion-Based Facial Animation

  • Head Movement

  • Target Actor Following (LookAt)

  • Eye Auto Focus for CineCamera

  • Weight and Blend Controls

  • Available for Keyframing in Sequencer

  • Works with LiveLink (ARKit)

  • Supports Windows, Mac, and Linux

Promotional Preview: https://youtu.be/8srr5ahd0hM

Promotional Preview v2: https://youtu.be/2eYTS7vYqoo

YouTube Tutorial: https://www.youtube.com/watch?v=ayqmWZg23pY
Support & Documentation: https://www.zenblink.com

  • Automates blinking, pupil constriction, eye and head movement, facial emotion, and eye/head target tracking.

  • Choose from 22 predefined character emotions.

  • Seamlessly integrate with Unreal Engine's Sequencer for cinematic-quality animations.

  • Perfect for Animation Generation, NPCs, Cinematics, Cutscenes, and Simulations.

ZenBlink delivers lifelike character interactions effortlessly.

If you require more comprehensive and tailored support, head over to our support site: ZenBlink Support & Documentation Website


Hello,
We are currently working on a VR project and are considering integrating Zenblink into our pipeline.
Would it be possible to track the gaze of MetaHumans and relate it to the position of the player's face, instead of using a camera as the reference point?

In your video, you mention: “Be aware that if you use Zenblink head movement, it will override any other animation in your sequence.”
Does this also apply to full body animations and facial animations (like mouth movement and expressions), or is it limited to head movements only?

Best regards.


For tracking, you can either attach/locate a target actor to the other character's face or create a custom proximity setup to make your MetaHuman "look at" any actor you want (see the sketch below).
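For example, a minimal C++ sketch of the attach approach (the function is mine for illustration; the same thing can be done in Blueprints with an AttachToComponent node):

```cpp
#include "GameFramework/Actor.h"
#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"

// Pin a look-at target actor to another character's head socket so the
// target always sits at that character's face. The target actor is whatever
// you assign as the MetaHuman's target to follow.
static void PinTargetToFace(AActor* LookTarget, ACharacter* OtherCharacter)
{
    if (!LookTarget || !OtherCharacter)
    {
        return;
    }

    // "head" is the standard head bone/socket on the MetaHuman body skeleton.
    LookTarget->AttachToComponent(
        OtherCharacter->GetMesh(),
        FAttachmentTransformRules::SnapToTargetNotIncludingScale,
        TEXT("head"));
}
```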

There are blend controls for head and facial animation, but ZenBlink controls neck_01 and up only; body animation is handled by the user.
Care has been taken to ensure ZenBlink works with existing lip-sync and facial expressions.


Hello Jobutsu,
I'm using neurosync to generate facial animation in real time. Does ZenBlink work with LiveLink in a real-time application?


Yes, ZenBlink is a post-process and should work in real time in most situations.
I have never heard of neurosync, so I can't say for certain that it will work, but if it uses standard LiveLink facial functionality/curves, I don't see why it wouldn't.

I hope this is helpful.

Hi, I’ve recently purchased ZenBlink. I wanted to check—does it include all the tools from ZenDyn, or would I need to buy both plugins? If both are needed, do they work together seamlessly?

I’m mainly using Unreal Engine for in-game cinematics, relying on UE’s audio-to-lip-sync feature along with premade animations. I bake these to Control Rig and then make manual adjustments. Based on this workflow, how can your plugins help enhance or speed up the process?

Also, is there a Discord or any community/forum where I can ask questions and get support?

Thanks!

Hi, I presume you are the person I am dealing with via the support ticket system?

For anyone else looking in:
ZenDyn is ONLY for detailed control over facial animation curves.
It also has basic dynamics tools that generate random facial movement.
Its target user base is animators who require more control over live/baked facial animation.

ZenBlink lacks those detailed facial curve controls/dynamics and is aimed at people like myself who require an all-round tool with features that help with MetaHuman eye/head performance.

Both plugins share:
Emotion poses
Micro-saccadic eye movement
Auto Blinking (with pupil dilation)

I hope this is helpful.


Hi Jobutsu,

One question: is it possible to use it on the Quest 3 headset as a standalone build (Android)?

Thanks!


First, sorry for the delay; I am not receiving forum notifications.

I can't answer your question because I have no experience with the subject.
ZenBlink is only tested on Windows, Mac, and Linux for now.

Hello,

I just want to thank you for this amazing plugin. I rated it 5 stars, by the way.

I recently started learning Unreal Engine to build my own game, and I've been trying hard to make it feel as realistic as possible. Unfortunately, I was never satisfied with my character; it felt lifeless, and I couldn't create proper sequences because I don't have the animation skills for MetaHumans. I spent days and weeks trying everything without results, to the point where I almost gave up and even started looking for people to hire to do what I needed.

Then, by pure chance, I found your plugin on Fab. I read the details and purchased it. After installing, I had exactly what I needed in under 5 minutes!

Thank you from the bottom of my heart; you truly made my character come to life, and in such a simple way. This plugin deserves 10 stars!

Best regards,
Rami


Thanks, Rami. If you have problems, either now or with a future version, don't hesitate to get in touch.


I am currently making a game and I have a MetaHuman set up with facial animation for dialogue. Will ZenBlink interfere with any of that? I use an animation montage for my lip-sync. Such an amazing asset, by the way. I'm just worried that this is mostly usable in Sequencer, because my project won't have much of that. I mainly just want the NPC MetaHumans to look at the player camera.


We talked about this elsewhere last week, correct?
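For anyone else looking in: one way to have NPC MetaHumans look at the player camera at runtime, with no Sequencer involved, is a small helper actor that shadows the camera every frame and is assigned as the look-at target. A rough C++ sketch under that assumption (the class name is mine for illustration):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Kismet/GameplayStatics.h"
#include "CameraFollowTarget.generated.h"

// An invisible helper actor that follows the player camera every frame.
// Assign it as the look-at target and the NPC's gaze tracks the player's view.
UCLASS()
class ACameraFollowTarget : public AActor
{
    GENERATED_BODY()

public:
    ACameraFollowTarget()
    {
        PrimaryActorTick.bCanEverTick = true;
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Copy the current view location of player 0's camera.
        if (APlayerCameraManager* Camera =
                UGameplayStatics::GetPlayerCameraManager(this, 0))
        {
            SetActorLocation(Camera->GetCameraLocation());
        }
    }
};
```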

Hi Jobutsu!

I purchased ZenBlink and find it exceptional - it added a level of authenticity to my UE-based facial mocap + marketplace body animation workflow!

I'm doing a talking-head cinematic with a seated character who looks at a moving camera via Target to Follow. The only issue I'm noticing after some extensive use is that having the head and eyes locked to the camera throughout a long-duration cut begins to look unnatural.

If I reduce the Head Movement Blend then the head begins to move more naturally along with the body animation, but the eyes no longer look at the camera.

For talking-head/interview-style videos like mine - where the performer is addressing the viewer/camera directly - this feels very unnatural.

Is it possible to lock the eyes on the camera (maintaining their saccadic movement) while allowing the rest of the head movement to blend with the body animation?

I’ve reviewed the settings and tutorial videos but I don’t see a way to do this - is this possible?

I attempted to keyframe Eye Aim Adjust, but trying to account for each movement of the gesture-heavy animations is onerous and difficult to get right - especially for long sections of dialog.

Thanks for your great plugin - it’s really leveled up the quality of my work!

Cheers!


In the next version, I have disconnected the eyes from the head in the targeting logic.

This should resolve the issue.
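Conceptually, the eye and head look-at rotations are now computed independently, so the eye aim can stay pinned to the target while the head blend follows the body animation. A rough illustration of the idea (illustrative math only, not the plugin's actual code):

```cpp
#include "Kismet/KismetMathLibrary.h"

// Eyes aim directly at the target, regardless of the head blend.
FRotator EyeAim(const FVector& EyeLocation, const FVector& Target)
{
    return UKismetMathLibrary::FindLookAtRotation(EyeLocation, Target);
}

// The head blends between the incoming animation pose and the look-at pose:
// HeadBlend = 0 keeps the body animation, HeadBlend = 1 locks to the target.
FRotator HeadAim(const FRotator& AnimHeadRotation, const FVector& HeadLocation,
                 const FVector& Target, float HeadBlend)
{
    const FRotator LookAt = UKismetMathLibrary::FindLookAtRotation(HeadLocation, Target);
    return FQuat::Slerp(AnimHeadRotation.Quaternion(), LookAt.Quaternion(),
                        HeadBlend).Rotator();
}
```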

I currently have an issue that is delaying the release of the update; once it is resolved, I will publish.

Amazing - good luck with the issue! And kudos for being so responsive - I know it's a lot of work. Your effort is greatly appreciated!
