How to use Composure with a VIVE tracker to move the camera?

I’m trying to do a very simple Composure composite where I take a couple of CG components and lay them over a live video feed from a webcam with a VIVE tracker attached. The data from the tracker should move the Unreal camera in real time.

I tried making a pawn, attaching a Motion Controller component and then a Cine Camera, and setting it to auto-possess player 1. That gives me a camera that moves around when I move the tracker and sends its output to the main viewport. But when I go into the Composure CG element, I can’t select this camera: it doesn’t show up in the list of cameras, and when I use “select from scene” it shows up as “incompatible”.

If I just drop a Cine Camera into the scene, the CG element will let me select it, but then I can’t seem to add a Motion Controller to the Cine Camera.

The only thing I’ve found that sort of works is to create a tracked pawn, drop one into the scene, drop a Cine Camera into the scene, and then “attach” the camera to the tracked pawn. This appears to work but seems kind of clumsy.

I must be missing something very basic here as I’m still fairly new to Unreal. Can anyone describe how to set up a VIVE tracker and camera so it can be used with a Composure element?

While creating a tracked pawn and attaching a Cine Camera seemed complicated, I was able to get it working and send the Composure output to the screen. I’ve been doing some manual adjusting to try to get the Cine Camera to match the Logitech C920 Pro’s parameters.

Things seem to be mostly lining up, but the camera is lagging behind the CG by several frames. Is there a way to introduce a delay in the VIVE tracker data so it will match up with the video?

I’ve seen this in the mixed reality tool, but I’m not using that.

Hey,

So the best way to synchronize two external sources would be to compare timestamps (Unreal can now do this with LiveLink sources). However, given your current hardware configuration, I would suggest buffering the pose from the VIVE tracker in an array of transforms and then using the pose closest to your incoming image (you will likely need to interpolate between poses due to frame-rate misalignment).
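
To illustrate, here’s a rough, untested C++ sketch of that idea. The class and property names are just placeholders, it assumes the HeadMountedDisplay and CinematicCamera modules are in your Build.cs, and the same logic can be built in Blueprint with an array of transforms:

```cpp
// TrackerPoseDelay.h - untested sketch. Drop one of these in the level, point
// TargetCamera at the CineCameraActor that your Composure CG element uses,
// and adjust PoseFrameDelay until the CG lines up with the video.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Components/SceneComponent.h"
#include "CineCameraActor.h"
#include "MotionControllerComponent.h"
#include "TrackerPoseDelay.generated.h"

UCLASS()
class ATrackerPoseDelay : public AActor
{
    GENERATED_BODY()

public:
    ATrackerPoseDelay()
    {
        PrimaryActorTick.bCanEverTick = true;

        RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));
        Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
        Tracker->SetupAttachment(RootComponent);
        // For a Vive tracker the motion source is usually one of the "Special_N" names.
        Tracker->MotionSource = FName(TEXT("Special_1"));
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);

        // Record the tracker pose for this frame at the end of the buffer.
        PoseBuffer.Add(Tracker->GetComponentTransform());

        // Drop poses we no longer need so the buffer stays small.
        const int32 Delay = FMath::Max(PoseFrameDelay, 0);
        while (PoseBuffer.Num() > Delay + 1)
        {
            PoseBuffer.RemoveAt(0);
        }

        // Drive the Composure camera with the pose from roughly Delay frames ago.
        if (TargetCamera)
        {
            TargetCamera->SetActorTransform(PoseBuffer[0]);
        }
    }

    // The CineCameraActor that the Composure CG element points at.
    UPROPERTY(EditAnywhere, Category = "Tracking")
    ACineCameraActor* TargetCamera = nullptr;

    // How many frames the CG camera should lag behind the live tracker data.
    UPROPERTY(EditAnywhere, Category = "Tracking")
    int32 PoseFrameDelay = 3;

private:
    UPROPERTY(VisibleAnywhere, Category = "Tracking")
    UMotionControllerComponent* Tracker;

    TArray<FTransform> PoseBuffer;
};
```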

Josh, do you think this synchronization can be accomplished with Blueprints somehow? I’m a coder from way back but haven’t ventured into writing this kind of thing for Unreal yet, so I’m not sure where to hook up a piece of C code to do it. Initially, at least, I think delaying by an integer number of frames would be OK.

Also, is there a simple way to lock Unreal’s render rate to match the camera (59.94 Hz, I think)?

Here’s a basic blueprint to illustrate the idea.

You can lock your project frame rate in Project Settings/General/Framerate, which should get you close enough for this delay to work.
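
For reference, that Project Settings page just writes these values into your project’s DefaultEngine.ini, so you can also set them there directly (59.94 shown as an example):

```ini
[/Script/Engine.Engine]
bUseFixedFrameRate=True
FixedFrameRate=59.94
```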

This is very useful information. Like I said before, I have been using a pawn with a Motion Controller component and then “attached” a Cine Camera object by dropping it on the pawn. If I understand your sample correctly, I should instead make a pawn to represent the tracker and use a blueprint like this to copy its position over to the Cine Camera every frame. Does it sound like I’ve understood you right?

Would I make this blueprint part of the level blueprint or put it in the pawn or cine camera?

I will give it a try tonight and see how it works for me.

Thanks!

Yes, for simplicity’s sake I made this in the level blueprint (bad programmer, no donut, I know), but in this context it’s probably fine to have a dedicated “Composure” sub-level and handle any communication in the level BP. I haven’t used the trackers in a while, but as long as you can get a world-space location/rotation from the device it will work.

Make sure you initialize your array with more than enough elements to buffer (10 frames should be plenty). Then play with the offset while panning the camera back and forth until there is minimal skating; you can then tween between the closest frames.
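
If it helps, the tween itself is just a blend between the two samples on either side of your target delay. In C++ it would look roughly like this (untested sketch, assuming the buffer stores poses oldest-first with the newest at the end):

```cpp
// Sample the buffered tracker poses with a fractional frame delay, blending
// between the two closest samples (untested sketch).
#include "CoreMinimal.h"

FTransform SampleDelayedPose(const TArray<FTransform>& PoseBuffer, float FrameDelay)
{
    if (PoseBuffer.Num() == 0)
    {
        return FTransform::Identity;
    }

    // How far back from the newest sample we want to read, clamped to the buffer.
    const float Back = FMath::Clamp(FrameDelay, 0.0f, (float)(PoseBuffer.Num() - 1));
    const int32 NewerIndex = PoseBuffer.Num() - 1 - FMath::FloorToInt(Back);
    const int32 OlderIndex = PoseBuffer.Num() - 1 - FMath::CeilToInt(Back);
    const float Alpha = FMath::Frac(Back); // 0 = use the newer pose, 1 = the older one

    const FTransform& Newer = PoseBuffer[NewerIndex];
    const FTransform& Older = PoseBuffer[OlderIndex];

    // Blend position linearly and rotation with a slerp.
    const FVector Location = FMath::Lerp(Newer.GetLocation(), Older.GetLocation(), Alpha);
    const FQuat Rotation = FQuat::Slerp(Newer.GetRotation(), Older.GetRotation(), Alpha);
    return FTransform(Rotation, Location);
}
```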

Thanks Josh! I set up a test of this last night; I was able to get the blueprint up and running and it looks like it’s working. I still have to move it over to my virtual set project to really test it. I’ll let you know how it works out.

With all the new tools coming in 4.23, I’m wondering if there is some way to treat VIVE controllers and tracking pucks the same way you treat mo-cap data coming in through LiveLink (i.e., with timecode and so on).

Hey Josh,

This tests out pretty well, but I had a problem at first when I tried your blueprint, so I modified it a bit (see picture).

I think with your original blueprint and an array size of 30, you had to set “Pose Frame Delay” to 28 to get a delay of 1 frame, 27 for 2 frames, etc.

In my second version, you get a delay of 1 frame with “Pose Frame Delay” set to 1, a delay of 2 for 2, etc.

In testing, both blueprints seem to work OK; the second version is a bit more intuitive and doesn’t require the array to be any particular size.
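
In case it helps anyone following along, the indexing in my version boils down to a ring buffer. A rough, untested C++ equivalent of the Blueprint would be something like this:

```cpp
// A circular-buffer variant of the pose delay (untested sketch). With a fixed-size
// array and a moving write index, a delay of N frames always reads N slots behind
// the most recent write, regardless of the array size.
#include "CoreMinimal.h"

struct FPoseRingBuffer
{
    TArray<FTransform> Poses;
    int32 WriteIndex = 0;

    explicit FPoseRingBuffer(int32 Capacity = 30)
    {
        Poses.Init(FTransform::Identity, Capacity);
    }

    // Store the newest pose, overwriting the oldest slot.
    void Push(const FTransform& Pose)
    {
        Poses[WriteIndex] = Pose;
        WriteIndex = (WriteIndex + 1) % Poses.Num();
    }

    // Read the pose from FrameDelay frames ago (0 = the pose just pushed).
    FTransform Read(int32 FrameDelay) const
    {
        const int32 Capacity = Poses.Num();
        const int32 ClampedDelay = FMath::Clamp(FrameDelay, 0, Capacity - 1);
        // "-1" because WriteIndex has already advanced past the newest sample.
        const int32 ReadIndex = (WriteIndex - 1 - ClampedDelay + 2 * Capacity) % Capacity;
        return Poses[ReadIndex];
    }
};
```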

However, I’m noticing that with the frame rate set to 30, a delay of zero isn’t enough and one frame seems to be a little too much.

I could tween between the last two VIVE tracker samples to get a better alignment of the tracker data with the video, but it seems like it would be better if I could get an array of timestamped VIVE tracker data at its actual update rate (or any rate faster than UE’s frame rate), get the timestamp of the current tick (the UE frame start), and then go through the array to find delayed data close to the delay I want. I’m not sure how to do this, though. Do you know how to:

  • sample the VIVE tracker faster than the UE frame rate and store the samples in an array with timestamps
  • find the actual time when the UE frame starts rendering (the real time when the tick fires)

I suppose an alternative would be some other way in Unreal to delay the tracker data, or an external tool that buffers/delays the tracker data before it gets to Unreal. I don’t mind writing some C++ code to do this if someone knows of example code that gives me an idea of how to hook into the tracker data.
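
To make it concrete, here’s the rough shape of what I’m imagining (untested, names made up). From the docs it looks like FPlatformTime::Seconds() gives the “right now” time and FApp::GetCurrentTime() gives the time recorded at the start of the frame, so a timestamped buffer could look something like this. Actually sampling the tracker faster than the game tick would presumably still need a separate thread polling the tracking API, which I haven’t figured out yet:

```cpp
// Untested sketch of a time-stamped pose buffer: push a pose whenever you have
// one, then ask for "the pose as it was N seconds ago" and interpolate between
// the two samples that bracket that time.
#include "CoreMinimal.h"
#include "HAL/PlatformTime.h"

struct FTimedPose
{
    double TimeSeconds;   // FPlatformTime::Seconds() when the pose was captured
    FTransform Pose;
};

struct FTimedPoseBuffer
{
    TArray<FTimedPose> Samples;   // oldest first, newest last

    void Push(const FTransform& Pose, double MaxHistorySeconds = 1.0)
    {
        const double Now = FPlatformTime::Seconds();
        Samples.Add({ Now, Pose });

        // Throw away anything older than we could possibly need.
        while (Samples.Num() > 2 && Now - Samples[0].TimeSeconds > MaxHistorySeconds)
        {
            Samples.RemoveAt(0);
        }
    }

    // Return the pose as it was DelaySeconds ago, interpolating between samples.
    FTransform Sample(double DelaySeconds) const
    {
        if (Samples.Num() == 0)
        {
            return FTransform::Identity;
        }

        const double TargetTime = FPlatformTime::Seconds() - DelaySeconds;

        // Walk back from the newest sample to find the pair that brackets TargetTime.
        for (int32 i = Samples.Num() - 1; i > 0; --i)
        {
            const FTimedPose& Newer = Samples[i];
            const FTimedPose& Older = Samples[i - 1];
            if (TargetTime >= Older.TimeSeconds)
            {
                const double Span = Newer.TimeSeconds - Older.TimeSeconds;
                const float Alpha = (Span > 0.0)
                    ? FMath::Clamp((float)((TargetTime - Older.TimeSeconds) / Span), 0.0f, 1.0f)
                    : 1.0f;
                const FVector Location = FMath::Lerp(Older.Pose.GetLocation(), Newer.Pose.GetLocation(), Alpha);
                const FQuat Rotation = FQuat::Slerp(Older.Pose.GetRotation(), Newer.Pose.GetRotation(), Alpha);
                return FTransform(Rotation, Location);
            }
        }

        // Requested time is older than everything we kept; fall back to the oldest pose.
        return Samples[0].Pose;
    }
};
```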

Also, if there were a way to synchronize Unreal’s render start with an external video source (i.e., genlock), that would let you sync up the video camera and CG so that only integer-frame delays would be needed.

I’m studying the docs to try to figure this out; your pointers have made the whole process go a LOT faster!

Thanks!

Hi,

I am very new to Unreal Engine. I am trying to get live camera data from MotionBuilder into Unreal through Live Link. I have created a blueprint and am getting data in Unreal, but when I go to the Composure CG element it shows the camera as missing. Can anyone help with the setup? Thank you. **JoshKerekes, .Corson**, could you please update on this?

Hey sada_uw,

Is your “livebp” actor in the world? Are you able to see the camera component receiving animation? In the Details panel of your cg_element you should be able to assign a soft reference to your camera actor under Composure/Input/CameraSource.

In 4.23 they’ve done a lot of work on LiveLink and a quality-of-life pass. I believe they will be updating their documentation as well when the stable build ships (hopefully next week!), so maybe we should circle back then?

Hey Josh,

Do you know if there is a way to sync Unreal’s frame rate with a webcam? I know the “pro” cards like Blackmagic and AJA can do it, but Unreal seems to support that feature only on those cards. I hunted around last night for a while but didn’t see anything that looked like a way to sync up with a webcam or other video capture card.

That’s going to be tricky. You could try using the incoming images you receive as the pulse to genlock everything and v-sync the engine; take a look at UAjaCustomTimeStep::WaitForSync() for how this is implemented.

You could also try replacing the audio I/O of your webcam with a timecode source and then feeding the same source into Unreal to sync up that way.

Honestly, I would just set a fixed frame rate in the project settings to closely match your webcam. The two will surely be out of phase, but if you can align the pose to the images closely enough it should look fine, especially at 59.94, and only slip occasionally.

The most important part is matching the horizontal field of view of the virtual camera to the AOV of the webcam.
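
As a rough, untested sketch of that last point: if you know (or can estimate) the webcam’s focal length and sensor size, you can compute its horizontal FOV and feed the same numbers into the Cine Camera’s filmback so the two match. The millimetre values below are placeholders, not real specs for any particular webcam:

```cpp
// Untested sketch: derive the horizontal FOV from focal length + sensor width
// and apply the same filmback/focal length to a CineCameraComponent so the
// virtual camera matches the physical one. All mm values are placeholders.
#include "CoreMinimal.h"
#include "CineCameraComponent.h"

void MatchPhysicalCameraFOV(UCineCameraComponent* CineCamera)
{
    const float SensorWidthMm = 6.17f;   // physical sensor width (placeholder)
    const float SensorHeightMm = 3.47f;  // physical sensor height (placeholder)
    const float FocalLengthMm = 3.67f;   // lens focal length (placeholder)

    // Horizontal FOV = 2 * atan(sensorWidth / (2 * focalLength))
    const float HorizontalFOVDeg =
        2.0f * FMath::RadiansToDegrees(FMath::Atan(SensorWidthMm / (2.0f * FocalLengthMm)));
    UE_LOG(LogTemp, Log, TEXT("Physical camera horizontal FOV: %.2f degrees"), HorizontalFOVDeg);

    if (CineCamera)
    {
        // Giving the Cine Camera the same filmback and focal length yields the same FOV.
        // (This property is named Filmback in newer engine versions.)
        CineCamera->FilmbackSettings.SensorWidth = SensorWidthMm;
        CineCamera->FilmbackSettings.SensorHeight = SensorHeightMm;
        CineCamera->CurrentFocalLength = FocalLengthMm;
    }
}
```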

Hi again,

I got some preliminary results and posted them if you want to have a look. Not perfect by any means, but it’s a first try. There is no attempt to match lighting or do lens calibration; I’m just using Cine Camera settings that match my DSLR setup. Video is coming in at 1080p60 from a Sony A7R DSLR with a 24mm lens over an Elgato Cam Link 4K.

GitHub project for those who are interested. It’s 3.8 GB because it includes the free Unreal “Gadget” character (is this OK, Epic?). Don’t expect too much as I’m still learning Unreal. The “Animation Test Map” has everything in it.

Aside from the things I mentioned earlier, there are a few things I’d appreciate help with if anyone has ideas:

  • When I record with OBS, it only works if I do “New Editor Window (PIE)”, and even then, if my mouse moves over some editor icons, the OBS window starts showing Unreal tooltips instead of the video. Is there a better way to set this up?

  • When I try to run the app standalone, everything seems to work except that the tracker does not connect to the Cine Camera like it does in the editor. How can I fix this?

  • The Cine Camera matches my DSLR lens pretty well, but it’s still a bit off. How would I go about entering lens distortion parameters? This is a Sony A7R DSLR lens, so I expect the details are in a database someplace (I know Adobe Camera Raw has them); I just don’t know how to get them into Unreal.

  • Any idea how to go about implementing a “shadow catcher” so I can composite Gadget’s shadow onto the floor?

I’m hoping that with a little help from everyone here I can put together a good example of how to do virtual sets at home with Unreal and common hardware.

I was going to use a Blackmagic Intensity Pro 4K card for this, but it doesn’t seem to be compatible with ANY of my cameras, including the Sony DSLR, Sony Action Cam, and GoPro Hero 4. Major bummer, because using this card would improve the frame-sync issues I mentioned in previous posts.

One more sample: I attached one of Gadget’s robots to a tracker so I could hold it. Tracking is pretty good; I can even throw it up in the air.

I have compositing working in the scene, but once I press Play it no longer works. How did you get compositing working in PIE?

Westcut… honestly, I’m not certain. Feel free to check out the project from GitHub and have a look. I did set the composition camera to “auto-possess”, which might be part of it. But as I said earlier, my trackers no longer work when I try to run it as a standalone game. Some things still seem kind of sketchy: when I made a pawn with the Motion Controller and Cine Camera all together, Composure wouldn’t recognize it. I ended up making a tracked pawn and a separate Cine Camera pawn, then copied the transform from one to the other with a blueprint.

One thing I noticed is that the video seems a bit fuzzy. I’ve not had this problem with OBS recordings before, so I’m wondering if it has something to do with Unreal? Normally things I record with OBS are tack sharp, and everything was done in 1080p60, so it should be much sharper.

I am going to try to make a smaller project demoing this. Gadget is a nice asset, but adding her makes the project huge (3.8 GB). I’ll find some simpler assets and make a smaller sample soon.

Well, for me it’s more along the lines of: when I press PIE it just kills the feed altogether, so every time I stop playing I have to re-add the camera to the media player. Same issue with my project, and even with yours when I download it.

So I got it all working now. I will be getting a VIVE and tracker soon; then I will test that and let you know how it goes.