Ask Unreal Anything: Animation | July 13, 2022 at 11AM EDT

I am just learning about the MetaHuman Creator.
For some items on my character, the shirt for example, there is the option to have a character without that item; for this I can just select "None".
This option is not there for the shoes.
Is there an alternative way to remove the shoes, as selecting "None" does for other items?

Animation Blueprints seem to be incredibly powerful for reducing what is actively being calculated at any given time. There is a sneak peek into how this was done in Fortnite on the supporting document page, but I was curious whether there are any further plans to do an Unreal Talk on the subject. Common things I'm running into that I would love more practical detail on:

  • How do we smoothly blend when linking and unlinking a layer, e.g. by piping the previously playing linked pose back into the base animation graph?
  • Can we pass variables like speed, actor rotation, etc. from the main graph to a linked graph, or is the intended workflow to re-set up the same casting calls in each graph?
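On the second point, one common pattern (aside from exposing instance-editable variables as pins on the Linked Anim Graph node) is to compute shared values once in the main instance and let linked graphs read them, instead of repeating the same casts in every graph. A minimal language-agnostic sketch in Python; all class and member names here are hypothetical, not engine API:

```python
class Pawn:
    """Hypothetical stand-in for the owning character."""
    def __init__(self, velocity):
        self.velocity = velocity  # (x, y, z)

class MainAnimGraph:
    """Computes shared values (speed, rotation, ...) once per update."""
    def __init__(self, pawn):
        self.pawn = pawn
        self.speed = 0.0
    def update(self):
        vx, vy, vz = self.pawn.velocity
        self.speed = (vx * vx + vy * vy + vz * vz) ** 0.5

class LinkedAnimGraph:
    """Reads the main graph's cached values instead of re-casting to the pawn."""
    def __init__(self, main_graph):
        self.main = main_graph
    @property
    def speed(self):
        return self.main.speed

pawn = Pawn(velocity=(3.0, 4.0, 0.0))
main = MainAnimGraph(pawn)
linked = LinkedAnimGraph(main)
main.update()
print(linked.speed)  # 5.0 -- computed once, read everywhere
```

The design point is just that the cast/computation lives in one place; each linked graph only holds a reference back to the shared data.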

Thanks!


I have two animation Sequence assets, C-1 and C-2, that need to be played.
When the hero uses a skill, the idle and skill animation sequences are blended by a "Blend List by Enum" node in the "Skill State".

Generally, when the skill is used for the first time, C-1 plays, and the second time, C-2 plays.
But if I use the skill again right after the first use, the interval between C-1 and C-2 is so short that the first part of C-2 gets skipped.

The problem is solved by enabling the Reset Child on Activation option, but I wonder how and why this situation occurs.
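Not an official answer, but the symptom can be reproduced with a toy model: while a child of a blend list is blending out, it is still being ticked, so its internal playback time keeps advancing; if it is re-activated before it has become fully irrelevant, it resumes from that advanced time, which is why the front of the clip appears skipped. Reset Child on Activation forces the time back to zero instead. A sketch of that idea (not the engine's actual tick logic; all names hypothetical):

```python
class ClipPlayer:
    """Stand-in for a sequence player node with an internal playback time."""
    def __init__(self, length):
        self.length = length
        self.time = 0.0
    def tick(self, dt):
        self.time = (self.time + dt) % self.length

class BlendListByEnum:
    """Toy blend list: the outgoing child keeps ticking while it blends out."""
    def __init__(self, children, blend_time=0.2, reset_child_on_activation=False):
        self.children = children
        self.blend_time = blend_time
        self.reset = reset_child_on_activation
        self.active = 0
        self.blend_out = None  # (child index, remaining blend-out time)
    def set_active(self, index):
        if index == self.active:
            return
        self.blend_out = (self.active, self.blend_time)
        self.active = index
        if self.reset:
            self.children[index].time = 0.0
    def tick(self, dt):
        self.children[self.active].tick(dt)
        if self.blend_out:
            idx, remaining = self.blend_out
            if idx != self.active:
                self.children[idx].tick(dt)  # still advancing while blending out
            remaining -= dt
            self.blend_out = (idx, remaining) if remaining > 0 else None

def use_skill_twice(reset):
    idle, skill = ClipPlayer(1.0), ClipPlayer(1.0)
    node = BlendListByEnum([idle, skill], reset_child_on_activation=reset)
    node.set_active(1); node.tick(0.1)   # first skill use
    node.set_active(0); node.tick(0.05)  # back to idle; skill blends out, still ticking
    node.set_active(1)                   # immediate second use
    return skill.time                    # where the skill clip resumes from

print(use_skill_twice(reset=False))  # ~0.15: resumes mid-clip, front skipped
print(use_skill_twice(reset=True))   # 0.0: clip restarts from the beginning
```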

Will it ever be possible to select inside of the rotation gizmo and rotate in any direction, similar to Maya? The current rotation gizmo is clunky for keyframe animation.


How would one go about animating a clear sphere that has character animations inside of it?

Hi there!

Now with modern GPUs having matrix accelerators like tensor and XMX cores, how will UE5 integrate real time machine learning for animation and physics in the near future?

In the Unreal docs it is noted that both IK Rig and Control Rig can be procedurally affected in animation blueprints. What are the ideal use cases for using one over the other to achieve Full Body IK adjustments to animations at runtime, if I am not intending to ever manually animate on the Control Rig? Can IK Rig be used for things like the Valley of the Ancients slope warping just as well as Control Rig was?

Thanks!


I have noticed that we can achieve similar results using IK Rig or Control Rig, but I was wondering which one is faster to process at runtime.

When two characters of different sizes use the same Control Rig, and in the Setup Event I snap the Head controller transform to the head bone transform, then in-game the head bone location ends up different depending on which preview skeletal mesh is used in the Control Rig editor.

For example, if I select a tall character in the Control Rig preview, then in-game the short character's head is stretched up to the tall character's head location.
Basically, retargeting doesn't seem to work if I pass locations, and I was wondering what the best way is to share Control Rigs across characters of multiple sizes.
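One workaround to experiment with (an assumption on my part, not a documented Control Rig feature) is to avoid passing absolute world locations and instead store the control's translation relative to the reference pose, or rescale it by the ratio of the two characters' sizes. The rescaling idea, with hypothetical numbers:

```python
def scale_translation(pos, source_height, target_height):
    """Rescale a position authored against one skeleton's proportions
    so it fits a differently sized skeleton."""
    s = target_height / source_height
    return tuple(c * s for c in pos)

# Head position authored on a 180-unit-tall preview character,
# replayed on a character half that height:
print(scale_translation((0.0, 10.0, 180.0), 180.0, 90.0))  # (0.0, 5.0, 90.0)
```

This only handles uniform size differences; characters with different proportions would need per-bone offsets rather than a single scale factor.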

We have followed the Matrix and Valley of the Ancient approach of having characters and animations facing Y+, and then rotating the character mesh component in the character Blueprint towards X+, which is game forward.
This seemed like a good solution until we discovered that, for root motion to work, animations have to face X+ on the first frame. So now we have characters facing Y+ and animations facing X+, and when they spawn we see a one-frame glitch rotation of 90 degrees.

I think we now have to rotate the characters 90 degrees towards X+, but I was wondering, first, why we have this contradiction, and second, whether rotating the characters 90 degrees on import is the best way to fix it.
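For what it's worth, the two conventions differ by a single 90-degree yaw: a forward vector authored along Y+ lands on X+ after a -90° yaw (in a right-handed, Z-up frame; the sign depends on the handedness convention). A quick check of that math, with nothing Unreal-specific in it:

```python
import math

def yaw_2d(vec, degrees):
    """Rotate an (x, y) vector about Z, right-handed (CCW positive from +Z)."""
    r = math.radians(degrees)
    c, s = math.cos(r), math.sin(r)
    x, y = vec
    return (x * c - y * s, x * s + y * c)

forward_authored = (0.0, 1.0)      # character/animation facing Y+
fx, fy = yaw_2d(forward_authored, -90.0)
print(round(fx, 9), round(fy, 9))  # 1.0 0.0 -> now facing X+ (game forward)
```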

In Control Rig development, are there any plans for further quality-of-life UI features?

For example, it would be nice to have a Blender-style "Parent: bone/control" selection dropdown in the Details panel, so you don't have to drag everything around in the hierarchy.

Also curious whether the animation team has anything they would like to have in Control Rig that's not there right now.

  1. Is it planned to get Pose Assets into Control Rig? It would make some things easy: just have a few pose assets, animate curves to blend between them, and have those poses update the rig positions.

  2. For animation modification in Sequencer: once I've added an animation track, I can add the auto-FK rig and set it to additive, with no baking, and it's really clean and awesome. But for a custom Control Rig, I have to bake the animation to the rig before I can modify things. Is there a way I don't know about to add a single Control Rig as an additive on an animation track without baking first?

  3. For Sequencer animation authoring: if I want to build different rigs for different interaction models versus one complex rig, is there a way to bake to a Sequencer track without exporting to a file? Or is the dev thought process to just make a single, more complex rig for in-engine animation authoring?

This is less technical, and more of a big-Epic question:

Are there plans to partner with an existing mocap company or provide access to a large mocap animation library (similar to materials and 3D objects in MegaScans)? Thanks!

Is there any current training you would suggest that can give someone who is not an animator, but may wear multiple hats on a project, an in-depth guide to taking a MetaHuman character and creating basic locomotion animations using Control Rig? Is this even possible with Control Rig alone?


Am I able to drive a vehicle in 'active level mode' and have a separate camera (not in the Blueprint) through which I can view what's happening, and then record a live take, e.g. a shot camera? Thanks!

Are there any plans to make the whole process of animating more user friendly, especially for artists coming from film/animation backgrounds? It's a combination of several small things that makes it a bit frustrating to animate in UE. I'll give you some examples:

  • There should be a way to disable LMB camera movement. There have been many times when, trying to select an object, I accidentally moved the camera instead!
  • Free-form rotation (three axes at once) is a must. At the moment I can only rotate one axis at a time, which again makes it slow to move/rotate things around while animating.
  • I have noticed that even grabbing the center of the translate gizmo could benefit from some improvements. I should not need to click three times to be able to move an object along two axes at once!
  • There should be a way to disable the "ease in, ease out" on the camera while navigating the viewports; it doesn't add anything to the experience. Just look at the way the camera orbits around an object in other DCC packages: very linearly, and it works :)

You might think these are minor points, but believe me, I have been animating for 20 years in animation and film, and I think it's the collection of these small changes that would make the difference when animators are debating whether or not to use UE for keyframe animation.

Looking forward to hearing your thoughts.
Thank you so much!


For real-time animation, using the in-engine tools and Control Rigs has a bit of a learning curve. For users who prefer animating in DCC apps, the workflow is FBX animation export.

In the Maya-Unreal pipeline, Datasmith and Live Link play a crucial role.

Can we expect a similar Live Link and Datasmith for Blender users?

The Animation Field Guide is very useful; thanks for putting it out. It would be great if you could put out separate videos demonstrating the real-time animation possibilities from different DCC apps.

For example, there is a webinar on "Animation Workflows Using Unreal Engine and Maya". A similar one for Blender would be very useful. Thanks.


About the Animation Sequence editor: why can't I modify the curve type on every frame? I can only set the interpolation to Linear or Step, and in some special cases this is important.

About animation compression: changing the animation compression settings always makes the editor crash. Also, in some long sequences, when the character's location moves far away from the start location, it causes some bad results.

About physics animation: I want to discuss a chain that is linked on both ends. What's the best way to do that? In my case it's a small hanging chain on my weapon, but both ends of the chain need to attach to the two sides of the gun. How do I solve this kind of physics animation? I tried making a physics asset and using it in the Animation Blueprint via the "Rigid Body" node, but the result is not really acceptable, and not stable…

How should I use the LyraCameraComponent for a true-FPS camera, if this component excludes the ability to assign a camera to the head bone? Should I rewrite the hero Blueprint to emulate all the default components and use my own camera component instead of Lyra's camera component? If so, how do I tell Lyra to use my camera component attached to the head bone?

Hi! :)

Is there any support in Unreal's Python API for creating Blueprint nodes (in Animation Blueprints or any other Blueprint)?

Thanks,
Vic