Can VR motion controllers be configured to control animations on a character, independent of the headset?

Hi, apologies if the question seems unclear. Can anyone shed some light on whether I can set up a motion controller to simply play different animations on a rigged character, rather than being tied to the position of the headset? The headset wearer is observing and speaking only, and since they’re in the headset, I might as well use the motion controllers to trigger actions they can see and respond to.

Think of it like a VR gameshow, where the contestant is wearing the headset and stays in a static position. The controllers are held by a second, real person in the same room, playing the host. The gameshow host is represented by a Mixamo character and voiced by that second person, not far from the contestant in the headset. The host asks questions and the contestant in the headset answers. Based on the answer given, the host chooses an animation and presses grab, trigger, etc., and the Mixamo character responds. General animations are sufficient; I’m not going down the Live Link path of face tracking and so on. I just want nodding, idle, falling over, shaking the head, that sort of thing.

Up until this point I figured the way to go was to approach it like swapping out the motion controller hands for custom hands with Blueprinted animations, then attaching them to the Mixamo character’s skeletal mesh. Is this how other people would approach the concept?
Is there a way I can simply use the controllers as an animation controller for a static character, instead of a motion controller? Would I set them as replacement “hands”, then remove the location-tracking references in the Blueprints so they no longer respond to physical motion?

Thanks for any tips, just trying to see if I’m on the right track or if I’m way off and there’s a better way to go. Cheers

Hi Phil,

So, from what I understand, you want to use the motion controllers as a sort of ‘remote control’.

For the ‘virtual representation’ of the motion controllers, the easy fix is to set ‘Hidden in Game’ on the mesh components. (That also avoids breaking dependencies if you’re using a VR template project.)
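For reference, here’s a minimal sketch of what that checkbox does, assuming a pawn that exposes its hand mesh components (the `LeftHandMesh` / `RightHandMesh` names are placeholders for illustration):

```cpp
// Equivalent of ticking 'Hidden in Game' on the hand mesh components:
// the controllers keep tracking and firing input, but render nothing.
LeftHandMesh->SetHiddenInGame(true);
RightHandMesh->SetHiddenInGame(true);
```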

I can imagine a BP_Host actor (basically just a Blueprint Actor with a Skeletal Mesh component). Then in the Class Defaults for this Blueprint, set ‘Auto Receive Input’ from Disabled to “Player 0”.

Now in the BP_Host you can create motion controller input events to play certain animations on the Host.
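If it helps to see the whole idea in one place, here’s a rough C++ equivalent of that setup. All names (`AHostActor`, `HostMesh`, `NodAnim`, and the “GrabRight” / “TriggerRight” action mappings) are placeholders for illustration, not anything the VR template provides:

```cpp
// HostActor.h -- rough C++ equivalent of the BP_Host idea above.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/InputComponent.h"
#include "Components/SkeletalMeshComponent.h"
#include "Animation/AnimationAsset.h"
#include "HostActor.generated.h"

UCLASS()
class AHostActor : public AActor
{
    GENERATED_BODY()

public:
    AHostActor()
    {
        // The visible gameshow host: just a skeletal mesh on an actor.
        HostMesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("HostMesh"));
        RootComponent = HostMesh;

        // Equivalent of setting 'Auto Receive Input' to Player 0
        // in the Blueprint's Class Defaults.
        AutoReceiveInput = EAutoReceiveInput::Player0;
    }

    // Animations to trigger, assigned in the editor (e.g. Mixamo clips).
    UPROPERTY(EditAnywhere, Category = "Host")
    UAnimationAsset* NodAnim = nullptr;

    UPROPERTY(EditAnywhere, Category = "Host")
    UAnimationAsset* ShakeHeadAnim = nullptr;

    // Play a one-shot clip directly on the mesh, no AnimBP required.
    void PlayNod()       { if (NodAnim)       { HostMesh->PlayAnimation(NodAnim, false); } }
    void PlayShakeHead() { if (ShakeHeadAnim) { HostMesh->PlayAnimation(ShakeHeadAnim, false); } }

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // InputComponent exists here because Auto Receive Input created it.
        // Action names must match mappings in Project Settings -> Input.
        if (InputComponent)
        {
            InputComponent->BindAction("GrabRight", IE_Pressed, this, &AHostActor::PlayNod);
            InputComponent->BindAction("TriggerRight", IE_Pressed, this, &AHostActor::PlayShakeHead);
        }
    }

private:
    UPROPERTY(VisibleAnywhere)
    USkeletalMeshComponent* HostMesh = nullptr;
};
```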

Let us know if anything is unclear, and welcome to the Forums!

Hi, thanks heaps for the info, much appreciated. Yes, ‘remote control’ is the right term for what I’m trying to achieve. Perhaps I’m not setting the VR controller button inputs correctly: I’ve been trying to reallocate the grab function through the existing Blueprints in the VR template, basing my attempts on the hand animation tutorials online, but so far not much is happening. I’ve got my Mixamo character performing animations when simulating, but it’s not cycling through different animations and not responding when buttons are pressed, so much more work to do!

If you’re using the VR template, a lot of input bindings are already set up (Project Settings → Input), so try using those.

On the input nodes there is a ‘Consume Input’ bool; your inputs might be getting consumed in the pawn. (The approach I mentioned in my previous post is the easiest implementation, but it’s generally best practice to have all your inputs in the pawn, with a reference to the BP_Host you want to send commands to, as in the sketch below.)
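A sketch of that ‘inputs live in the pawn’ pattern, reusing the hypothetical `AHostActor` from the earlier sketch (`HostRef` would be pointed at the placed host actor in the level):

```cpp
// MyVRPawn.h -- the pawn handles input and forwards commands to the host.
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/InputComponent.h"
#include "HostActor.h"
#include "MyVRPawn.generated.h"

UCLASS()
class AMyVRPawn : public APawn
{
    GENERATED_BODY()

public:
    // Reference to the host actor placed in the level, set in the editor.
    UPROPERTY(EditAnywhere, Category = "Gameshow")
    AHostActor* HostRef = nullptr;

    virtual void SetupPlayerInputComponent(UInputComponent* PlayerInputComponent) override
    {
        Super::SetupPlayerInputComponent(PlayerInputComponent);

        // Handle the controller button here, then send the command on.
        // "GrabRight" is an assumed action mapping from Project Settings -> Input.
        PlayerInputComponent->BindAction("GrabRight", IE_Pressed, this, &AMyVRPawn::OnGrabRight);
    }

    void OnGrabRight()
    {
        if (HostRef)
        {
            HostRef->PlayNod();
        }
    }
};
```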

I’d also bind some keyboard inputs; that way it’s easier to test without needing to put on the HMD.
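Continuing the hypothetical pawn sketch above, one way to get that desk-testing fallback is a raw key binding next to the BindAction call:

```cpp
// Inside SetupPlayerInputComponent: fire the same handler from the 'N' key
// so the whole flow can be tested without putting on the headset.
PlayerInputComponent->BindKey(EKeys::N, IE_Pressed, this, &AMyVRPawn::OnGrabRight);
```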