
How can I control a blend space like this?

I'm trying to grab an enemy by the head in VR and pull them about a bit.
I want to recreate this system.

I have created the blend space with the enemy leaning in all 4 directions, and I'm passing the world transforms of my motion controller to the enemy's Animation BP.
But I'm unsure how to make the blend space lean in the direction of my hand, OR stagger in the direction of the throw.

The lean direction is always known, isn't it? It's just the Look At vector toward the player's hand, computed inside the thrown actor.

"Thrown" direction would be different, but similar. You just "write" it down when the throw happens and have the actor use it/discard it accordingly.

PS: 3D blend space? Isn't that a lie? You only have two axes.

Not a lie, but you're right; at the moment I'm only trying to move the enemy's head in the X and Y world-space plane.
I'm passing the VR player's hand world transforms to the enemy BP and sending that to the enemy's Animation BP.
I have done blend spaces before where I update based on the axis input, but it's a bit murky how I could do that from a world location of my hand.

Well, you don't; pass a vector of the direction instead. Or, even better, solve for an angle between 0 and 360 and pass that.

It's basic trig, so it's easy for the actor BP to do, and since it's 0 to 360 it's easy to implement inside the blend space.

To solve the angle you just do a Dot between the two vectors and AcosD,
as you have probably done a billion times before.
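For reference, the math behind that Normalize → Dot → AcosD node chain can be sketched in plain Python (the vectors below are made-up examples, not values from the project):

```python
import math

def angle_between_deg(a, b):
    """Unsigned angle in degrees between two 2D direction vectors,
    mirroring the Normalize -> Dot -> AcosD Blueprint chain."""
    ax, ay = a
    bx, by = b
    # Normalize both vectors first; the dot of unit vectors is cos(angle).
    la = math.hypot(ax, ay)
    lb = math.hypot(bx, by)
    ax, ay = ax / la, ay / la
    bx, by = bx / lb, by / lb
    dot = ax * bx + ay * by
    # Clamp to guard against floating-point drift outside [-1, 1].
    dot = max(-1.0, min(1.0, dot))
    return math.degrees(math.acos(dot))

# Perpendicular directions are 90 degrees apart.
print(angle_between_deg((1, 0), (0, 1)))   # → 90.0
# Opposite directions are 180 degrees apart.
print(angle_between_deg((2, 0), (-3, 0)))  # → 180.0
```

Note this only gives 0 to 180; it doesn't tell you left from right on its own.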

I would even go as far as to assume that the direction the actor was thrown is 180° opposite to the angle given by the location of the hand in relation to the thrown actor.
It's an assumption, so you know how those usually go…
Still, momentum usually carries, and if you have Newtonian physics in place (vacuum, no drag, no drift) this should be a safe assumption.
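Under that assumption, the remap is just a wrap-around in degrees (the input angle here is a made-up example):

```python
# Hypothetical angle from the thrown actor to the hand, in degrees.
angle_to_hand = 250.0
# Throw direction assumed 180 degrees opposite, kept in the 0..360 range.
throw_angle = (angle_to_hand + 180.0) % 360.0
print(throw_angle)  # → 70.0
```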

Thanks, but I'm still learning programming; UE4 is my first ever introduction to coding and I have only ever used Blueprints.
I'm still trying to understand some concepts and have yet to use a Dot product for anything. From a quick bit of research, it's used to get the angle between two objects, so some of this has gone over my head. If you have time, could you show me how this could work, mainly using the location of my hand to drive the other actor's blend space?

Take the end-point vectors and Normalize them. Feed them into a Dot node, then drag off the Dot output and type AcosD.
Print String to see the result.

Depending on which vectors you feed in as A and B you'll get a different value. Go with whichever looks like what you expect from testing.

And I would only run this on the thrown actor, since it needs to know where the hand is compared to its own forward vector.

A Look at Rotation node may be a viable way to get the rotated vector, but it doesn't have a single degrees output you could use to set up the 0-to-360 blend space value.
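For what it's worth, one way to collapse a direction into a single 0-to-360 value (AcosD alone only spans 0 to 180) is atan2 of the 2D cross and dot. A plain-Python sketch of the math, not UE nodes, with made-up vectors:

```python
import math

def yaw_0_360(forward, to_target):
    """Signed angle from `forward` to `to_target` in the 2D plane,
    remapped to 0..360 so it can drive a single blend space axis."""
    fx, fy = forward
    tx, ty = to_target
    # atan2(cross, dot) yields a signed angle in (-180, 180].
    angle = math.degrees(math.atan2(fx * ty - fy * tx, fx * tx + fy * ty))
    # Wrap negatives around so the range is 0..360.
    return angle % 360.0

print(yaw_0_360((1, 0), (0, 1)))   # target 90° to one side  → 90.0
print(yaw_0_360((1, 0), (0, -1)))  # target on the other side → 270.0
```

Which side maps to 90 versus 270 depends on your axis convention, so test in-editor and flip the sign if it's mirrored.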

End-point vectors?
The world locations of each actor?
Sorry, not sure I understand.

Yes, whatever the end points are.
Locations from actors, in this case.

Thanks for trying to help me out, but I haven't been able to get this working yet, probably due to my lack of understanding of some of the areas around getting degrees. I did some more research, though, and I think it might be possible to do this with a Get Look at Rotation fed into a Calculate Direction node.
Or by getting both the component's velocity and transform and inverse-transforming them into the local space of the character I want to lean in the direction of the component, using the character's lean blend space.
I just need to figure out how to feed these into the vertical/horizontal values in the blend space to make the character lean toward the component.
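The inverse-transform idea can be sketched like this: express the hand's world location in the character's local frame, then use the local forward/right offsets as the blend space's two axes. This is plain Python illustrating the math, not the UE API, and all numbers are made up:

```python
import math

def world_to_local_2d(char_pos, char_yaw_deg, hand_pos):
    """Express the hand's world XY position in the character's local frame.
    Convention (borrowed from Unreal): local +X = forward, local +Y = right."""
    dx = hand_pos[0] - char_pos[0]
    dy = hand_pos[1] - char_pos[1]
    yaw = math.radians(char_yaw_deg)
    # Rotate the world offset by -yaw to undo the character's rotation.
    local_x = dx * math.cos(yaw) + dy * math.sin(yaw)
    local_y = -dx * math.sin(yaw) + dy * math.cos(yaw)
    return local_x, local_y

# Character at the origin facing +X; hand 50 units ahead, 30 to the side.
fwd, right = world_to_local_2d((0, 0), 0.0, (50, 30))
print(fwd, right)  # → 50.0 30.0
# These two values (optionally clamped or normalized to the blend space's
# input range) would drive the vertical and horizontal lean axes.
```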

The Z angle is
AcosD(actor A forward vector dot actor B forward vector).
Both actors' vectors should be normalized.

You can swap in the vector of the throw for the forward vector of B, since that's retained as the direction upon landing.

Does that help?

Not sure if I set this up correctly.

Only thing I can think of is that the player hand's forward vector is the same as the character's forward vector when you play an animation.
It doesn't change with the actual rotation of the hand?

Never used the VR template, so I'm not sure how it handles that.

Maybe just get a socket location/forward vector off the hand, so it's always moving with it…