I recently had the idea of using Unreal Engine and the HTC Vive to experiment with puppet-style animations.
While I’ve had a lot of success blueprinting for the Vive, I can’t seem to find a way to make it play nice with Sequencer.
I could technically just capture the raw footage from the Vive and edit that into a short film, but what I'm looking for is a way to capture the transforms of everything I move in a VR session (as well as a camera blueprint attached to the HMD), so I can finesse the keyframes in Sequencer afterwards.
Sequencer seems to stop recording as soon as it starts when playing in VR, which I think may be linked to my inability to find a way to play VR in the viewport.
I know this is all new and experimental stuff, but does anyone have any experience with this?
Now that 4.12 has been released and updated a couple of times, I tried again and was able to find a method of doing this that works.
For future people that might want to do this:
1. Set up a VR pawn the way you normally would. (I'm using an HTC Vive, so I have motion controllers and a headset with roomscale; my VR pawn includes a SteamVR Chaperone and a Motion Controller component for each hand, with a mesh child under each for visualisation.)
2. Create a setup in Blueprints that sets the transform of an object to the transform of each motion controller and/or the headset on every tick. Do not attach or otherwise parent the object to the motion controllers/headset, or the keyframes will not be recorded. The object taking transforms from the motion controllers or headset can be an actor in the level, or, more simply, a component in the VR pawn itself.
3. Open the Sequence Recorder (Window > Sequence Recorder).
4. Play the level in VR. (As far as I can tell, the following steps MUST happen while playing in VR.)
5. Press Shift+F1 to enable mouse control of the editor while playing, then navigate back to the Sequence Recorder window.
6. Click +Add, select the new recording, go down to "Actor to Record", and choose your VR pawn if the object taking transforms is a component of your pawn; otherwise, select the actor that is taking the transforms.
7. Hit Record and navigate back to the PIE window so you can take control in VR once again. You should see the on-screen countdown and should be able to record keyframes onto the objects you've blueprinted. Once you quit, or the time runs out on the recording (I think it's based on the sequence length in the recording parameters), your animation will compile and you will be able to open it in Sequencer.
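Step 2 is the non-obvious part, so here's a toy sketch in plain C++ of why you set the transform every tick instead of attaching. This is not the actual UE4 API (`Simulate`, `Vec3`, and `Track` are made-up names for illustration); it just models the idea that the recorder keys the transform an object *owns*, so an attached child's own relative transform never changes while the parent moves, and its track comes out static:

```cpp
#include <vector>

struct Vec3 { float X, Y, Z; };
bool operator==(const Vec3& a, const Vec3& b) {
    return a.X == b.X && a.Y == b.Y && a.Z == b.Z;
}

// One "recorded track": the transform values sampled each tick.
using Track = std::vector<Vec3>;

// Simulate a session: the controller moves along `path`. We record both an
// attached child (whose own relative transform stays at the origin, since the
// motion lives in the parent) and a free proxy object whose transform we
// explicitly set from the controller on every tick.
void Simulate(const Track& path, Track& attachedKeys, Track& proxyKeys) {
    const Vec3 attachedRelative{0, 0, 0}; // never changes while attached
    Vec3 proxy{0, 0, 0};
    for (const Vec3& controller : path) {
        proxy = controller;                       // "Set World Transform" on Tick
        attachedKeys.push_back(attachedRelative); // recorder sees a flat track
        proxyKeys.push_back(proxy);               // recorder sees the real motion
    }
}
```

With a moving `path`, `attachedKeys` comes out flat while `proxyKeys` follows the controller, which is why you pipe transforms to an unattached object (or a component on the pawn) rather than parenting.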
You can also copy/paste the keyframes, though the sheer number of keyframes involved does get heavy for Unreal to manage.
Sort of: you can use the captured keyframes of an object copying the transforms of the headset to drive a new camera in sequencer, then set the FOV of that camera to whatever the Vive’s FOV is (I don’t know it off-hand)
Hey Shrodo, thanks for this. I've managed to follow steps 1-6 and bake out some keys from the camera, but now I have no idea where the animation data went, or how to apply it to a new camera. As a total noob I'm going to battle my way through this, but in the meantime, if you could expand on point 7 at all that would be really great. I'm so close to getting this working! Thanks, Tom
I think I need to read up on Blueprints some more. I'm not sure if I should be setting this up in the level blueprint or the pawn blueprint, and where should the "cube" (the object we bake the keys to) be defined? In the pawn blueprint too?
I have the cubehead object as a component inside the VR pawn blueprint, along with the Set World Transform nodes.
If you’ve recorded the sequence correctly, you’ll see the editor compiling the data for the capture as soon as you’re done. It will save it as something like “Recorded level sequence_1” inside the Cinematics>Sequences folder in your content browser. If you double-click that sequence, it should open up and you should see all your baked keyframes.
Here is what I see when I double-click the sequence. I'm not sure if these are baked keys (it looks like they might be), but I don't know how to get this to play back in the viewport, even though I set the viewport to show the sequence!
I think my problem might be my lack of knowledge of the sequence editor; I'm finding it very confusing!
Another update: in the sequence editor, if I make a new camera I'm able to copy and paste the keys onto it and get the motion I need, but an automatic way to apply the cubeHead transforms to a camera in there would be great. I'm planning on doing a lot of HD video capture.
None of these workarounds seem to be necessary for UE 4.13 any longer. I just tried with 4.13 Preview 1 and you no longer need to pipe transforms to different components. The camera and controller parented meshes are recorded as expected. What’s more, you don’t get flooded with an excessive amount of keyframes as with the per tick transform adjustment. And the headset camera is recorded correctly so you can see gameplay as viewed through the headset if you lock the viewport to the recorded camera. Thanks Epic!
Hey, thanks Morni. I'm still playing with this, and I see Sequencer has loads of new functionality for exactly this type of thing, so I'll check it out. Looking forward to being able to use the Vive to capture animation to export to Maya; I'm thinking the new export stuff is going to make that easy to do. Nice one!
Hey Morni, you couldn't give me a quick overview of your new workflow, could you, please? There isn't much in the way of documentation for 4.13, is there, and Sequencer is still confusing me as I'm a total UE4 noob (but I've used Maya for years, so I'm hoping to learn fast).
OK, here's my latest problem. I'm so close now, but the transforms are coming in skewed by something.
Basically, I appear in the world at the player start, which has a location of -1130, and I'm recording this, so I'd expect the baked keys to be something like that, but as you can see it's more like +1059 for the X location.
It's like I need to zero something to the origin somewhere, maybe?