You can set layers to blend with inertial blending. This requires an Inertialization node in your main AnimBP.
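As a minimal C++ sketch (the character class and RifleLayersClass property here are hypothetical), linking a layer at runtime looks like this; with blend times set on the Linked Anim Layer node and the Inertialization node in place in the main AnimBP, the swap blends inertially rather than cross-fading:

void AMyCharacter::EquipRifleLayer()
{
    // Link a weapon-specific anim layer; the Linked Anim Layer node's blend
    // settings plus the Inertialization node in the main AnimBP make this
    // an inertial blend.
    GetMesh()->LinkAnimClassLayers(RifleLayersClass);
}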
Hi guys! I have a few questions for you.
1. When I apply a Rigid Body node to my physics asset, I get very weird and violent motion when I play montages; for example, my coat goes crazy. How can I control the Rigid Body node when I jump? And how can I reset the physics when it ends up in a non-default position?
2. How can I start an animation in the middle of its timeline? For example, I have a loop of a character climbing, and I would like it to start with the right hand, and then be on the left hand at a certain time.
3. I really don't get the idea behind conduits.
Thanks for this forum!
Yes! The team is currently developing Constraints for Sequencer, which we hope will be in the next UE release. This will allow Control Rigs to interact with other Control Rigs and other objects in the world.
Also in the works is animating multiple Control Rigs at a time, for example, a body and face rig, or the Mannequin and a weapon.
While we’re still actively developing many tools for animators, we’re also ensuring there is Python exposure so the community can also build tools. Here is some documentation on how to get started.
Hello! In regards to Virtual Production, I find a lot of the content from Unreal (docs/tutorials) is geared towards game dev (which makes complete sense); I'm just wondering if Epic has plans to start a few series geared towards those workflows? For instance, the Retargeting material doesn't really show what Epic would do when streaming directly from a motion capture system. I would love to remove another program (MotionBuilder) from our pipeline, and have done so, but it would be great to see this kind of workflow and use of the tools from a real-time perspective, since we aren't dealing with any animation files.
So yes, in short, I'm hoping that some more in-depth real-time series come up, or that there are plans for that?
We definitely plan to have tutorials, documentation and examples with data you can download. More soon!
It should map automatically out-of-the-box. If there’s been a regression we can push out a fix to those assets directly.
We’ll take a look at this ASAP. Thanks!
Yes, we’re continuing to develop and refine Animation Mode. A few of the features in the works are covered in this thread: Ask Unreal Anything: Animation | July 13, 2022 at 11AM EDT - #67 by Jeremiah3D
Thank you!
I can speak to starting an animation in the middle. In the case of a montage, just create a section at the position you want to start from, then call Montage_JumpToSection with the section's name. For regular anim sequences played via Animation Blueprints, you can expose the Start Position as a pin on the Sequence Player node and pipe the time value you want into it.
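For the climbing example above, a minimal C++ sketch might look like the following (the character class, ClimbMontage property, and "LeftHand" section name are all hypothetical; the montage would need a section authored at the left-hand reach):

void AMyCharacter::StartClimbFromLeftHand()
{
    UAnimInstance* AnimInstance = GetMesh()->GetAnimInstance();
    if (AnimInstance && ClimbMontage)
    {
        // Start the montage, then jump straight to the named section.
        AnimInstance->Montage_Play(ClimbMontage);
        AnimInstance->Montage_JumpToSection(TEXT("LeftHand"), ClimbMontage);
    }
}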
One use case for State Machine Conduits is providing common transition logic instead of duplicating that logic across multiple transitions: the transition into the conduit handles the 'common' case (e.g. is moving), and the transitions out of the conduit can be more specific (e.g. is crawling, is running, is walking), as sketched below.
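As an illustration (the state and rule names here are hypothetical), the graph shape looks like this:

Idle --[is moving]--> (Conduit) --[is walking]--> Walk
                                --[is running]--> Run
                                --[is crawling]--> Crawl

The "is moving" check lives in one place, on the transition into the conduit, rather than being copied onto three separate transitions.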
Great question. We are currently evaluating the workflow for live retargeting using the new IK Retargeter. It's a bit clunky at the moment: the setup works, but it requires a lot of boilerplate, and runtime access to the retarget parameters is not properly exposed (tuning the results requires modifying the retarget asset).
Supporting runtime retargeting for Virtual Production remains a high priority for the IK Retargeting plugin in particular, and the animation team in general.
I was still really curious about this, as I don't think you can IK the socketing/transforms.
I found a way to get a reference to the 'rendered' mesh and attach things to that (something like the sketch below), but it still feels pretty messy.
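For reference, a minimal sketch of the attachment itself (the prop component and socket name are hypothetical):

// Attach a prop to a socket on the character's rendered skeletal mesh
// so it follows the animated pose.
PropMesh->AttachToComponent(
    GetMesh(),
    FAttachmentTransformRules::SnapToTargetNotIncludingScale,
    TEXT("hand_rSocket"));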
Thanks for the reply! Honestly, this helps my project quite a bit, as I have been searching and trying many things to get more MotionBuilder-like controls at runtime (pin hands/feet, etc.), and it sounds like this is still in the works!
Not at the moment; the closest thing to it right now is the Blueprint-centric documentation, most of which maps directly to C++.
We’ve actually gotten this question before, and I’m lazy so I’m going to copy and paste our response from that here as well.
Question: "Root motion import orientation changes extraction behavior
Prior to 5.0, we were able to import animations from Maya that had an arbitrary forward orientation. When imported into Unreal with root motion enabled on the animation, it would move as it did in Maya, with both the mesh and the root motion oriented in that arbitrary direction; the animation was correct and looked as it did in Maya.
With 5.0 we see different behavior: animations that aren't facing positive X don't get their root motion extracted properly. We see the actor moving in one direction and the mesh facing a different direction, so what we see in Maya is inconsistent with what we see in the Editor. It's not just an axis change; it seems like root motion extraction assumes an orientation somewhere and creates a mismatch between the motion data and the mesh orientation.
This can be seen in the animation editor, when selecting Character > Animation > RootMotion > Loop (or Loop and Reset). The behavior is the same when animating a character in game.
Is this an intended change? Is it a known bug?"
Answer: "We are aware of this. It is indeed an ‘issue’ we introduced with a change we made in UE5 to be able to import and use root motion animation offset from the origin, which is something that improves the workflow for multi actor interactions and it wasn’t possible in UE4.
We have discussed this internally, and we think this is a content error that the engine should not be dealing with when extracting root motion. Our plan is to provide a way to fix the orientation of your animation in the engine, but there is no ETA for that yet. In the meantime you have two options:
Go back to the DCC and fix the orientation of the mesh in the animation. The mesh in the animation should have the same orientation as the ref pose (e.g. if the mesh is facing the X axis in the ref pose, the mesh in the animation should also be facing the X axis).
If you can modify the engine locally and you don't need to import root motion animations offset from the origin, you can replace UAnimSequence::ExtractRootMotionFromRange with the code below, which is what we had in UE4."
FTransform UAnimSequence::ExtractRootMotionFromRange(float StartTrackPosition, float EndTrackPosition) const
{
    const FVector DefaultScale(1.f);

    FTransform InitialTransform = ExtractRootTrackTransform(0.f, NULL);
    FTransform StartTransform = ExtractRootTrackTransform(StartTrackPosition, NULL);
    FTransform EndTransform = ExtractRootTrackTransform(EndTrackPosition, NULL);

    // Use old calculation if needed.
    if (bUseNormalizedRootMotionScale)
    {
        // Clear scale as it will muck up GetRelativeTransform
        StartTransform.SetScale3D(FVector(1.f));
        EndTransform.SetScale3D(FVector(1.f));
    }
    else
    {
        if (IsValidAdditive())
        {
            StartTransform.SetScale3D(StartTransform.GetScale3D() + DefaultScale);
            EndTransform.SetScale3D(EndTransform.GetScale3D() + DefaultScale);
        }
    }

    // Transform to Component Space Rotation (inverse root transform from first frame)
    const FTransform RootToComponentRot = FTransform(InitialTransform.GetRotation().Inverse());
    StartTransform = RootToComponentRot * StartTransform;
    EndTransform = RootToComponentRot * EndTransform;

    return EndTransform.GetRelativeTransform(StartTransform);
}
Follow Up Question: “Thanks. Could you explain a bit more about how it’s useful for multi actor interactions to have root motion offset from the origin? What exactly is the difference in the motion extraction approaches?”
Answer: "For multi actor interactions you need to know the location of each actor relative to the other at any point in the animation, to ensure they are perfectly aligned during the interaction. The same applies to IK: you usually need to know, for example, where the hand of character A should be relative to the head of character B in order to fix contact points when playing the interaction on uneven terrain.
In UE4, when extracting root motion, we counter-rotated the root transform by the rotation in the first frame (as you can see in the function above). That was done to deal with the problem you described, but it means you cannot have root motion animations offset from the origin (root motion will move the actor in the wrong direction, similar to what you are seeing now). When working on interactions, the animator would animate the interaction in the DCC with each actor perfectly aligned relative to the other, but after importing the animation they had to reset the position/rotation of each actor back to the origin so root motion could be applied correctly. This prevented us from simply extracting the bone transform of each animation at the same time to know where one actor should be relative to the other during the interaction, so the animator would have to create an extra bone in the skeleton to represent that information. This could cause a few problems, especially if this bone wasn't considered at the beginning of the project. Also, depending on the animator's experience, the process of adding this bone sometimes required a little back and forth to understand how the bone should be created (should it be a child of the root, should it move with the root, etc.).
On the other hand, being able to import the animation offset from the origin allows us to simply extract all of that information at runtime, without extra bones in the skeleton; the animator no longer needs to worry about resetting the animation back to the origin before exporting, or about adding and animating extra bones.
But regardless of all that, we just think the engine should not be trying to make sense of the content when extracting root motion. The idea of counter-rotating the root motion by the rotation in the first frame of the animation was an attempt to deal with what is really a content error; proof of this is that it commonly happens when users import animations from different marketplaces without really understanding how the animation was created and expect it to just work. Unfortunately there is no way (as far as we know) of fixing this problem entirely; as you can see, one approach fixes one case but breaks another. That's why, after discussing this internally, we decided that the best option is not to add extra code to the root motion extraction to try to deal with all those cases, and instead to provide an easy way for users to fix their animation in the engine. We are hoping to get it done soon; we are thinking that an animation modifier will do it."
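To make the benefit concrete, here is a minimal sketch using the same FTransform API as the extraction code above (the function name is hypothetical). Given the root transforms of the two interaction animations sampled at the same time, with their authored scene offsets intact, the alignment transform falls out directly:

// Where should actor B be, relative to actor A, at a given time?
// RootA/RootB are root transforms sampled from each animation at that time.
FTransform GetActorBRelativeToA(const FTransform& RootA, const FTransform& RootB)
{
    // Expressed in A's space, so the pair stays aligned wherever A is placed.
    return RootB.GetRelativeTransform(RootA);
}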
We also now have (in UE5 Main) a new AnimModifier (ReOrientRootBoneModifier) to fix the animation. However, it currently has a bug when the animation uses a different skeleton; we will likely have a fix for that by 5.1. Unfortunately we don't have an example of how to use it, so it may be a bit difficult to understand for non-technical people. It basically asks you for the rotation you want to apply to bring the mesh to face the same axis the ref pose is facing. I don't think there is a way to do that automatically yet.
Last question: when I blend the layer to false, is there any way, when coming back to true, to NOT reset the sequence animation? I would like the animation to stay at the point it was at, so I can freeze it and come back to the same point it was at before; in other words, a way to stop the animation and then resume it in the same place.
Yes, we’re continuing to develop and refine Animation Mode. A few of the features in the works are covered in this thread: Ask Unreal Anything: Animation | July 13, 2022 at 11AM EDT - #67 by Jeremiah3D
Thank you for your reply! It is true that my main concerns are covered in the other thread. I will keep an eye out for future updates. A big thank you as well for the work you and your team have already done on control rig and the animation mode, both are amazing features as is!
This is also important to us and something we have been focusing on recently now that the foundational functionality is in.
They will likely have comparable performance at runtime. The question of which to use should come down to the specifics of your use case:
If you’re doing a simple IK modification, and would benefit from a convenient workflow tailored to IK then use an IK Rig and drive it with an IK Rig anim node in the animation blueprint.
If you’re doing retargeting between different character types, you will need to use an IK Rig to hold your characterization data (bone chains) and potentially IK as well.
If you’re keyframe animating, you must use a Control Rig.
If you’re looking to build a sophisticated procedural modification at runtime, use Control Rig.
In the future, IK Rigs will be embeddable in Control Rigs, where they will act as a convenient container for IK setups whose goals can be driven however you choose.
We are not planning on integrating Valorant’s engine modifications (such as the recording of animation graph state to enable server-side rollback). We are however taking the requirements on board with regards to planning future animation logic-related features.
Control Rig is continually being improved and refined. We have many plans for where Control Rig is headed and look forward to sharing more information soon. We always welcome workflow and feature suggestions!