Procedurally tweaking animations in real time using control rigs

I am trying to implement an interaction that makes the player’s character press a button on the wall. I have an animation that does that, but at a fixed position. I want to somehow guide the hand toward where an actual button exists (an actor whose location I can get).

Although there are tutorials for this kind of thing, I am learning and prefer to use this “real world” problem to come up with my own solutions and right now I need a hand.

I am looking for guidance, advice, or maybe someone telling me I got it all wrong and pointing me in a direction that will help me build a correct workflow.


Current setup is:

  • Character with a skeletal mesh
  • Animation blueprint with logic for ground locomotion
  • Control rig I created myself
    • It has controls for the arms, the poles work perfectly, everything works
  • Retargeted animation for pressing the button on a wall
  • Animation montage of the pressing animation
    • This animation only happens on the upper body slot
    • The montage is split into two sections: from idle to moving the finger into position, from the pressing position back to idle

The button on the wall is an actor that has a scene component where the hand is supposed to touch and it passes that scene component to the character on overlap (which is when the character is close enough to the button so the arm can reach).

The character passes the location of the scene component to the animation blueprint, which passes it on to the control rig.

Additionally, there is a boolean that lets the control rig know that it should move the hand to the button. (after everything is working I will reorganize my animation blueprint)


Right now I can make the character touch the button immediately by using the full body IK node. Not exactly there yet, but it’s a start.


The character blueprint looks like this:

Just a side note: the skeletal mesh is called First Person because I have a system to toggle between first- and third-person views; the first-person mesh wears casual clothes and the third-person one an EVA suit. But it is not in use at this time.

The Set Pressable Target Location stores where the hand is supposed to go.

Set Pressable Target Location is what tells the control rig that it can move the hand.

The montage plays the animation up until the position of the screenshot at the start of the post, and is set to not auto blend out:


If the trigger (Set Pressable...) happens before the montage, the character punches the button and immediately starts the animation, which is expected.

If it is after, nothing happens.

From what I read online, this is because the montage is still… “controlling” the model, but I couldn’t find anything elaborating on that.


TL;DR:

  • Can I influence pre-existing animations using a control rig and IK?
    • if yes: how?
    • if not: what is the correct approach?
      • do the entire thing procedurally
      • put the buttons at the height the hand touches and force the character into a position and rotation that the finger matches exactly the button (not really what I wanted to do)

This is one example, but I want to learn the correct workflow because I am already looking toward more interesting applications.

Thanks in advance!

You may not need the full video, but the logic can be gleaned from here:

Open Doors In Unreal Engine Using Control Rig (Part 1)

Yep! That really looks like it is.

The hand logic is done in part 4 (https://www.youtube.com/watch?v=N2T_9YfuH1g).

Despite having 0 explanation of anything he’s doing and the graph having 0 organization or structure I think this might solve my problem.

I’ll give it a try later and will come back to report the results.

Thanks a lot!

Yeah, it’s not the greatest tutorial, teaching-wise, though it’s the only one of its kind that I could find for picking up objects with hand placements (and at different heights). I can confirm, though, that it works well when conformed to a specific application. Godspeed.

The videos did not help at all because there are literally no comments whatsoever on what is going on.

But I did figure out what was wrong and it was a silly mistake: the slot where the animation was played was evaluated after the rig, which was overriding everything the rig was doing.

The solution was to rig after the slot:


I will write up what I learned and how I solved it, in case anyone runs into the same problem in the future.

Step 1: create the interactable actor

The actor should have:

  • Static mesh (the object)
  • Box collision to start the interaction
  • A scene component located where the hand (or other body part) should touch (PressTarget)
  • A scene component located where the character must move in order to be able to reach the object (PositionTarget)
    • don’t worry about this component’s height; just place it at the correct XY location

On the actor’s construction script, do a line trace downwards from the PositionTarget and place it at the Z value of the trace hit. This way you have a generic actor that you can place how high or low you want.
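That floor snap just swaps the target’s Z for the trace hit’s Z. A minimal sketch of the idea in plain Python (the names `snap_to_floor` and `floor_hit_z` are my own, standing in for the construction script’s downward line trace):

```python
def snap_to_floor(position_target, floor_hit_z):
    """Keep the designer-placed XY, take Z from the downward line trace hit."""
    x, y, _ = position_target
    return (x, y, floor_hit_z)

# A target placed at (120.0, -40.0, 300.0) whose trace hits the floor at z = 12.5:
print(snap_to_floor((120.0, -40.0, 300.0), 12.5))  # → (120.0, -40.0, 12.5)
```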


Step 2: overlap event

Here, simply create an interface, implement it on your Character, and add a function that the actor can call to send the positions of the two scene components.

Additionally you can add some logic to allow the button to be pressed more than once.

With this data on the Character, it is up to you to trigger the interaction however you want. By pressing a button, by just walking to it, your call.


Step 3: get in position

In my implementation I force the character to get into a specific position (the PositionTarget’s position) and rotation so that the animation aligns well.

I tried the Simple Move To Location node, but it wasn’t precise enough: the character never ended up exactly where I needed it. So my solution was:

Make a timeline and interpolate location and rotation using Set Actor Location and Rotation. The interpolation function got quite a lot of nodes, so I won’t add a screenshot, but it was basically:

  1. store actor location and rotation in variables
  2. create a timeline that went from 0 to 1 over 0.5 seconds
  3. lerp vector (initial location, PositionTarget location, alpha from the timeline)
  4. use Find Look at Rotation to get the rotation where the character should face
  5. lerp rotator (initial rotation, look at rotation, alpha from the timeline)

This should run on the Update node.
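The five steps above boil down to two lerps driven by the timeline’s alpha. A rough Python sketch of one Update tick (hypothetical names; the yaw-only look-at and the plain lerp of the rotation are simplifications — a real rotator lerp should also handle angle wrap-around):

```python
import math

def lerp(a, b, t):
    # Component-wise vector lerp, the equivalent of the Lerp (Vector) node
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def find_look_at_yaw(from_loc, to_loc):
    # Yaw-only stand-in for Find Look at Rotation, in degrees
    dx, dy = to_loc[0] - from_loc[0], to_loc[1] - from_loc[1]
    return math.degrees(math.atan2(dy, dx))

def move_step(initial_loc, target_loc, initial_yaw, alpha):
    # One timeline Update tick: blend location and facing toward the target
    look_yaw = find_look_at_yaw(initial_loc, target_loc)
    loc = lerp(initial_loc, target_loc, alpha)
    yaw = initial_yaw + (look_yaw - initial_yaw) * alpha
    return loc, yaw
```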

Bonus!

If your animation blueprint drives movement animation from the Character Movement Component’s Velocity, then your character will slide into place while playing the idle animation.

To prevent that you can:

  1. create a boolean in the character named something like IsBeingMoved
  2. when you start interpolating the position, set it to true
  3. in your animation blueprint where you store the Velocity, check if character IsBeingMoved
    3.1. if false, store Velocity
    3.2. if true, do (CurrentPosition - PreviousPosition) / DeltaTime and store how fast it is moving
  4. when you finish interpolating, set IsBeingMoved back to false
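The bonus logic above can be sketched like this (plain Python, hypothetical names): while the timeline moves the actor, velocity is reconstructed by finite differences instead of read from the Movement Component:

```python
def effective_velocity(current_pos, previous_pos, delta_time,
                       measured_velocity, is_being_moved):
    """Velocity to store in the animation blueprint each tick.

    While IsBeingMoved is true, the Movement Component reports ~zero velocity,
    so derive it from consecutive positions instead.
    """
    if is_being_moved:
        return tuple((c - p) / delta_time
                     for c, p in zip(current_pos, previous_pos))
    return measured_velocity
```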

I don’t know if the screenshot will be readable, but it is something like this (in the animation blueprint):


Step 4: animate it

When the movement timeline ends, play your montage and at the same time use a timeline to feed a value from 0 to 1 and back to 0 that will be used as the weight for your control rig transformations.
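That 0 → 1 → 0 weight can be a simple triangular curve over the timeline’s length; a sketch with a hypothetical `rig_weight` helper (a real timeline would typically use an eased curve instead of straight ramps):

```python
def rig_weight(t, duration):
    """Control rig weight that ramps 0 -> 1 over the first half of the
    timeline and back down to 0 over the second half."""
    half = duration / 2.0
    if t <= half:
        return t / half
    return max(0.0, 2.0 - t / half)
```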


Step 5: rig it!

I created an interface and implemented it on my animation blueprint; it activates the rig and passes data to it. Since it’s quite a lot of stuff, I created a struct to better organize things, which lets me fully customize the transformations and reuse them later.

Then I used that data to conditionally select which hand should move and/or rotate, and applied those transforms.

The screenshot might not be readable, but just showing that having all the data packaged makes it easier to work with it.


Step 6: profit(?)

It worked out better than I expected, and I like that it is very flexible. You still have to do some fine-tuning based on the model and the animation, but I am excited about how easy it is to tweak animations.