Jobutsu - ZenBlink For Metahumans

ZenBlink Plugin for Unreal Engine

Bring MetaHuman characters to life with ZenBlink, a powerful plugin for dynamic eye, head, and face animation and emotional realism.

Features:

  • Auto Blink

  • Pupil Constriction

  • Eye Animation

  • Basic Emotion-Based Facial Animation

  • Head Movement

  • Target Actor Following (LookAt)

  • Eye Auto Focus for CineCamera

  • Weight and Blend Controls

  • Available for Keyframing in Sequencer

  • Works with LiveLink (ARKit)

  • Supports Windows, Mac, and Linux

Promotional Preview: https://youtu.be/8srr5ahd0hM

Promotional Preview v2: https://youtu.be/2eYTS7vYqoo

YouTube Tutorial: https://www.youtube.com/watch?v=ayqmWZg23pY
Support & Documentation: https://www.zenblink.com

  • Automates blinking, pupil constriction, eye and head movement, facial emotion, and eye/head target tracking.

  • Choose from 22 predefined character emotions.

  • Seamlessly integrate with Unreal Engine's Sequencer for cinematic-quality animations.

  • Perfect for animation generation, NPCs, cinematics, cutscenes, and simulations.

ZenBlink delivers lifelike character interactions effortlessly.

If you require more comprehensive and tailored support, head over to our support site: ZenBlink Support & Documentation Website

Hello,
We are currently working on a VR project and are considering integrating ZenBlink into our pipeline.
Would it be possible to track the gaze of MetaHumans and relate it to the position of the player's face, instead of using a camera as the reference point?

In your video, you mention: “Be aware that if you use ZenBlink head movement, it will override any other animation in your sequence.”
Does this also apply to full-body animations and facial animations (like mouth movement and expressions), or is it limited to head movement only?

Best regards.

For tracking, you can either attach or position a target actor at the other character's face, or create a custom proximity setup to make your MetaHuman "look at" any actor you want.
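
For the VR question above, the same mechanism applies: attach the target to the player's camera component (or an actor parented to it) instead of a mesh socket. Below is a minimal UE C++ sketch of the attach approach; the same thing can be done in Blueprints with the Attach Actor To Component node. The "head" socket name and the LookTarget helper actor are assumptions for illustration (nothing here is ZenBlink API); you then point ZenBlink's LookAt target at that actor in the plugin settings.

```cpp
#include "GameFramework/Actor.h"
#include "Components/SkeletalMeshComponent.h"

// Sketch only: parent a plain helper actor to another character's head so it
// follows the face every frame. ZenBlink's LookAt target is then set to this
// actor in the plugin's own settings. "head" is the usual MetaHuman/UE5
// skeleton socket name; verify it on your character's skeletal mesh.
void AttachLookTargetToFace(AActor* LookTarget, USkeletalMeshComponent* FaceMesh)
{
    if (!LookTarget || !FaceMesh)
    {
        return;
    }

    // Snap the helper actor onto the head socket so it tracks the face.
    LookTarget->AttachToComponent(
        FaceMesh,
        FAttachmentTransformRules::SnapToTargetNotIncludingScale,
        TEXT("head"));

    // Optional: nudge the target forward of the skull so the gaze converges
    // on the face rather than inside the head.
    LookTarget->SetActorRelativeLocation(FVector(10.f, 0.f, 0.f));
}
```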

There are blend controls for head and facial animation, but ZenBlink only drives the skeleton from neck_01 up; body animation is handled by the user.
Care has been taken to ensure ZenBlink works with existing lip-sync and facial expressions.

Hello Jobutsu,
I'm using neurosync to generate facial animation in real time. Does ZenBlink work with LiveLink in a real-time application?

Yes, ZenBlink is a post-process and should work in real time in most situations.
I have never heard of neurosync, so I cannot say definitively that it will work, but if it uses standard LiveLink facial functionality/curves, I don't see why it wouldn't.

I hope this is helpful.

Hi, I’ve recently purchased ZenBlink. I wanted to check—does it include all the tools from ZenDyn, or would I need to buy both plugins? If both are needed, do they work together seamlessly?

I’m mainly using Unreal Engine for in-game cinematics, relying on UE’s audio-to-lip-sync feature along with premade animations. I bake these to Control Rig and then make manual adjustments. Based on this workflow, how can your plugins help enhance or speed up the process?

Also, is there a Discord or any community/forum where I can ask questions and get support?

Thanks!

Hi, I presume you are the person I am dealing with via the support ticket system?

For anyone else looking in:
ZenDyn is ONLY for detailed control over facial animation curves.
It also has basic dynamics tools that generate random facial movement.
Its target user base is animators who require more control over live/baked facial animation.

ZenBlink lacks those detailed facial curve controls/dynamics and is aimed at people like myself who need an all-round tool with features that help with MetaHuman eye/head performance.

Both plugins share:

  • Emotion poses
  • Micro-saccadic eye movement
  • Auto blinking (with pupil dilation)

I hope this is helpful.