I am trying to implement an interaction that makes the player’s character press a button on the wall. I have an animation that does that, but on a fixed position. I want to somehow guide the hand towards where an actual button exists (an actor that I can get the location of).
Although there are tutorials for this kind of thing, I am learning and prefer to use this “real world” problem to come up with my own solutions and right now I need a hand.
I am looking for guidance, advice, or maybe someone telling me I got it all wrong and hopefully pointing me in a direction that will help me build a correct workflow.
Current setup is:
- Character with a skeletal mesh
- Animation blueprint with logic for ground locomotion
- Control rig I created myself
- It has controls for the arms, the poles work perfectly, everything works
- Retargeted animation for pressing the button on a wall
- Animation montage of the pressing animation
- This animation only happens on the upper body slot
- The montage is split into two sections: one from idle to the finger in pressing position, and one from the pressing position back to idle
The button on the wall is an actor with a scene component marking where the hand is supposed to touch. On overlap (i.e., when the character is close enough that the arm can reach the button), the button passes that scene component to the character.
The character passes the location of the scene component to the animation blueprint, which passes it on to the control rig.
Additionally, there is a boolean that lets the control rig know that it should move the hand to the button. (After everything is working I will reorganize my animation blueprint.)
Right now I can make the character touch the button instantly by using the Full Body IK node. Not exactly there yet, but it’s a start.
The character blueprint looks like this:
Just a side note: the skeletal mesh is called First Person because I have a system to toggle between first and third person; the first person is casual clothes and the third person is an EVA suit. But it is not in use at this time.
The Set Pressable Target Location stores where the hand is supposed to go. Set Pressable Target Location is what tells the control rig that it can move the hand.
The montage plays the animation up until the position of the screenshot at the start of the post, and is set to not auto blend out:
If the trigger (Set Pressable...) happens before the montage, the character punches the button and immediately starts the animation, which is expected.
If it happens after the montage, nothing happens.
From what I read online, this is because the montage is still… “controlling” the model, but I couldn’t find anything elaborating on that.
TL;DR:
- Can I influence pre-existing animations using a control rig and IK?
  - If yes: how?
  - If not: what is the correct approach?
    - Do the entire thing procedurally?
    - Put the buttons at the height the hand touches and force the character into a position and rotation where the finger matches the button exactly (not really what I wanted to do)?
This is one example, but I want to learn the correct workflow because I am already looking towards more interesting applications.
Thanks in advance!