How did the developers implement a procedurally generated environment? The rooms in the game change every time; how do they do this?
Player animation
What did they do to get the player's hand to grab the door handle and open it? I know it's an animation, but how do they make the animation sync so his hand grabs the door handle every time? The same goes for cutscenes that are in first person: how do they get the player's hand to sync up and grab the desired objects?
Any tips or insight into how these processes work would be greatly appreciated!
Usually "procedural" means the game just randomly combines pre-made level pieces. It has a set of rooms, hallways, and so on, and it simply attaches them together.
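Here's a rough sketch of that idea in plain C++ (no engine API; the piece names and numbers are made up for illustration). Each run picks random pre-made pieces and places the next one's entrance at the previous one's exit:

```cpp
// Minimal sketch of snapping pre-made pieces together. RoomPiece and the
// palette below are hypothetical examples, not from any real game.
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <vector>

struct RoomPiece {
    const char* name;
    float length;  // how far this piece extends before its exit doorway
};

int main() {
    std::srand(static_cast<unsigned>(std::time(nullptr)));

    // Hand-made pieces the generator is allowed to pick from.
    const std::vector<RoomPiece> palette = {
        {"SmallRoom", 4.0f},
        {"Hallway",   6.0f},
        {"BigRoom",   8.0f},
    };

    // Chain random pieces together: each new piece's entrance is placed at
    // the previous piece's exit, so the layout differs every run.
    float cursor = 0.0f;
    for (int i = 0; i < 6; ++i) {
        const RoomPiece& pick = palette[std::rand() % palette.size()];
        std::cout << "Spawn " << pick.name << " with entrance at x=" << cursor << "\n";
        cursor += pick.length;  // next piece attaches at this piece's exit
    }
}
```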
For the animation, most of the time for that type of thing the game moves the character to the correct position and just plays an animation. But there are also tools that let you retarget parts of the animation, like IKinema, which can take a regular door-opening animation and adjust it so the hand always reaches the door handle, regardless of where the character is standing.
Ah man, that game sounds a lot like mine LOL, only I'm going for more Doom/Quake-style gameplay.
I did procedural environments by randomly spawning rooms relative to other rooms, and the interiors of the rooms can be filled with random objects. In the end it's all incredibly complicated, and I had to cut corners on a few ideas once I saw that what I had was good enough: it still let me create what I wanted without getting so complex that it was unmanageable to even create content for it. I'm hoping to have something cool to show soon once I've made good enough content.
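For anyone curious, here's a rough sketch of that approach in plain C++ (made-up names, no overlap checking): pick an already-placed room, spawn a neighbour next to it, then drop a random prop inside it. A real generator would also reject grid cells that are already occupied.

```cpp
// Rough sketch of "spawn rooms relative to other rooms, then fill the
// interiors with random props". Plain C++ with illustrative names only.
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <vector>

struct Room { int x, y; };  // rooms on a simple grid, one cell per room

int main() {
    std::srand(static_cast<unsigned>(std::time(nullptr)));

    const int offsets[4][2] = {{1, 0}, {-1, 0}, {0, 1}, {0, -1}};
    const char* props[] = {"Crate", "Desk", "Locker", "Nothing"};

    std::vector<Room> rooms = {{0, 0}};  // start room at the origin

    for (int i = 0; i < 8; ++i) {
        // Pick an already-placed room and spawn a neighbour next to it.
        // (A real generator would also check the cell isn't occupied yet.)
        Room parent = rooms[std::rand() % rooms.size()];
        const int* dir = offsets[std::rand() % 4];
        Room next = {parent.x + dir[0], parent.y + dir[1]};
        rooms.push_back(next);

        // Fill the new room's interior with a random prop.
        const char* prop = props[std::rand() % 4];
        std::cout << "Room at (" << next.x << ", " << next.y
                  << ") contains: " << prop << "\n";
    }
}
```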
The hand holding the door part is likely done with inverse kinematics. It's the same concept used for placing a character's feet correctly when standing on slopes and stairs. The game knows where the door knob is in the world, so when the open-door animation plays, it also forces the character's hand onto the door knob using inverse kinematics, the same way a character can play a walking animation while the feet still get repositioned correctly.
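If you want to see the math, here's a self-contained sketch of two-bone IK in 2D in plain C++ (no engine types; names and numbers are illustrative). It's the same law-of-cosines solve an engine's IK node performs in 3D to pull a hand bone onto a target like a door knob:

```cpp
#include <algorithm>
#include <cmath>
#include <iostream>

struct Vec2 { float x, y; };

// Given the shoulder position, the two bone lengths, and a target the hand
// should reach, solve for where the elbow and hand end up.
void SolveTwoBoneIK(Vec2 shoulder, float upperLen, float lowerLen,
                    Vec2 target, Vec2& outElbow, Vec2& outHand) {
    float dx = target.x - shoulder.x;
    float dy = target.y - shoulder.y;
    // Clamp so the target is never further than the arm can physically reach.
    float dist = std::clamp(std::sqrt(dx * dx + dy * dy),
                            1e-4f, upperLen + lowerLen - 1e-4f);

    // Law of cosines: angle at the shoulder between the upper arm and the
    // shoulder-to-target line, and the interior angle at the elbow.
    float cosShoulder = std::clamp(
        (upperLen * upperLen + dist * dist - lowerLen * lowerLen)
            / (2.0f * upperLen * dist), -1.0f, 1.0f);
    float cosElbow = std::clamp(
        (upperLen * upperLen + lowerLen * lowerLen - dist * dist)
            / (2.0f * upperLen * lowerLen), -1.0f, 1.0f);

    float baseAngle = std::atan2(dy, dx);
    float shoulderAngle = baseAngle + std::acos(cosShoulder);            // bend "up"
    float forearmAngle  = shoulderAngle - (3.14159265f - std::acos(cosElbow));

    outElbow = {shoulder.x + upperLen * std::cos(shoulderAngle),
                shoulder.y + upperLen * std::sin(shoulderAngle)};
    outHand  = {outElbow.x + lowerLen * std::cos(forearmAngle),
                outElbow.y + lowerLen * std::sin(forearmAngle)};
}

int main() {
    // Arm with a 1m upper arm and 1m forearm reaching for a "door knob".
    Vec2 elbow, hand;
    SolveTwoBoneIK({0.0f, 0.0f}, 1.0f, 1.0f, {1.4f, 0.5f}, elbow, hand);
    std::cout << "elbow: " << elbow.x << ", " << elbow.y << "\n"
              << "hand:  " << hand.x  << ", " << hand.y  << "\n";
}
```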
In Unreal there are skeletal controls that are part of the animation graph.
I myself did this to create a single two-handed weapon holding animation, which I reuse for every variety of two-handed weapon in the game. I tell my character's left hand to move to the correct spot on the weapon, but the underlying animation still plays. You can add an influence weight to the skeletal control that creates the inverse kinematics effect, so the animation doesn't immediately snap the hand into place. When a character is grabbing for a door knob, use an animation curve to control the influence: the hand starts out uninfluenced and slowly moves toward the door knob over the course of the animation. I did the same thing for my weapon holding animations. In my weapon take-out and put-away animations, an animation curve forces the hand to slowly reach for the correct spot, so that by the time the take-out animation is done, my character's hand is in the right place.
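Here's a rough sketch of that curve-driven influence weight in plain C++ (names and values are made up): at alpha 0 the hand follows the baked animation, at alpha 1 it's fully pinned to the IK target, and the alpha ramps up over the clip the way an animation curve would:

```cpp
#include <iostream>

struct Vec3 { float x, y, z; };

// Linear blend between the animated hand position and the IK target.
Vec3 BlendHand(Vec3 animated, Vec3 ikTarget, float alpha) {
    return {animated.x + (ikTarget.x - animated.x) * alpha,
            animated.y + (ikTarget.y - animated.y) * alpha,
            animated.z + (ikTarget.z - animated.z) * alpha};
}

// Stand-in for an animation curve: ramp from 0 to 1 over the first 40% of
// the clip, then hold at 1 so the hand stays pinned to the door knob.
float ReachCurve(float normalizedTime) {
    float t = normalizedTime / 0.4f;
    return t > 1.0f ? 1.0f : t;
}

int main() {
    Vec3 doorKnob = {100.0f, 40.0f, 110.0f};  // IK target in world space

    // Sample a few frames of a one-second "open door" animation.
    for (float time = 0.0f; time <= 1.0f; time += 0.25f) {
        // In a real setup this position would come from the playing animation.
        Vec3 animatedHand = {80.0f + 10.0f * time, 30.0f, 100.0f};

        Vec3 hand = BlendHand(animatedHand, doorKnob, ReachCurve(time));
        std::cout << "t=" << time << "  hand: " << hand.x << ", "
                  << hand.y << ", " << hand.z << "\n";
    }
}
```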