How do you use Composure with a VIVE tracker to move the camera?

Was the issue the part of the level blueprint that gets the camera feed? I apologize, but that is hardwired, so you need to open up the camera feed manually and copy/paste the URL and the track number for the feed you want into the blueprint. I haven't figured out how to do this automatically yet.
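For reference, the hardwired part of the blueprint roughly corresponds to something like this in C++. This is only a sketch: the helper name, the URL and the track index are placeholders for the values you'd normally paste in by hand.

```cpp
// Hypothetical helper illustrating what the hardwired blueprint step does:
// open a capture URL on a UMediaPlayer and pick the video track for the feed.
#include "MediaPlayer.h"

void OpenCameraFeed(UMediaPlayer* MediaPlayer, const FString& CaptureUrl, int32 VideoTrackIndex)
{
	if (!MediaPlayer)
	{
		return;
	}

	// Start the capture source (the URL copied out of the Media Player editor window).
	// OpenUrl is asynchronous, so in practice you would select the track from the
	// player's OnMediaOpened event rather than immediately afterwards.
	if (MediaPlayer->OpenUrl(CaptureUrl))
	{
		MediaPlayer->SelectTrack(EMediaPlayerTrack::Video, VideoTrackIndex);
	}
}
```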

You may want to get more than one Vive tracker: you will use one for the camera and will need a second one if you want to track an object. If you have a full Vive setup, you should also be able to bind objects to the Vive controllers, though I haven't tried this.

When you are happy with how your project works, please consider posting it to GitHub, or at least tell us here about any new things you have learned!

Actually no, I had changed the URL first thing. My issue was how I set up the material, which was pretty silly actually: I hadn't assigned my material to the media plate, and I had parented the FG and BG elements to the media plate.

I tried the Virtual Camera plugin though, and that worked without issues, so that's a positive. When I get the complete Vive set I will test out the tracking. Aside from the Vive, I am going to try other trackers, smaller chips that can just be attached to an object.

What types of trackers are you going to try? I’m interested.

When you start tracking objects/cameras, the main issue is that you have to delay the tracker data to make up for the frame or two of delay in the video. That is in my blueprint, but the way I do it is not ideal. We really need something lower in the engine that delays all the control and tracker inputs by the same amount.
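As a rough illustration of the idea (not the exact blueprint in my project), a small fixed-length buffer of transforms does the job. The class name and the assumption of a per-tick push are mine:

```cpp
// A minimal sketch of delaying tracker data by a fixed number of engine ticks so it
// lines up with the late-arriving video.
#include "CoreMinimal.h"

class FTrackerDelayBuffer
{
public:
	explicit FTrackerDelayBuffer(int32 InDelayFrames)
		: DelayFrames(FMath::Max(InDelayFrames, 0))
	{
	}

	// Call once per tick with the latest tracker transform; returns the transform
	// from DelayFrames ticks ago (or the oldest sample while the buffer fills up).
	FTransform PushAndGetDelayed(const FTransform& Latest)
	{
		Samples.Add(Latest);

		const int32 DelayedIndex = FMath::Max(Samples.Num() - 1 - DelayFrames, 0);
		const FTransform Delayed = Samples[DelayedIndex];

		// Keep only the samples we still need.
		if (Samples.Num() > DelayFrames + 1)
		{
			Samples.RemoveAt(0);
		}
		return Delayed;
	}

private:
	int32 DelayFrames;
	TArray<FTransform> Samples;
};
```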

Currently I'm checking out tracking chips that I can just attach to objects, but for the immediate future it will more than likely be a Vive, though it's not really practical considering the size. I like how the Vive tracks everything including orientation, so that is something to consider when getting one. For now it's just searching companies to see which has the best options.

I’ve got a couple of questions regarding the tracker data so it syncs up with live video and CG.

I was looking at the MR plugin sources and saw their tracker delay code, which seems to live in Engine/Source/Runtime/HeadMountedDisplay/Private/MotionDelayBuffer.cpp.
This seems to be part of the main game engine; is it always active, and can it be set manually so all the tracker data is delayed?

Also, if you are using LiveLink mocap data along with a camera and a pro video card that supports timecode, do the video, CG and LiveLink data all get synced up by timecode automatically, or do you have to do something?

Here’s another experiment (Tracked Light Test - YouTube): this time I’ve put a tracker on a flashlight, then did some filming in a dark room with that as the only light source. This is very basic with no shadows or bounced light, but it still looks pretty good. Updates (the “lighting test map”) are on GitHub (GitHub - MiloMindbender/GadgetTest: Test of virtual sets and tracked cameras); the lighting test map uses Special 1 for the camera and Special 2 for the flashlight.

This uses a standard LED flashlight in the real world and a “spotlight” in Unreal as the tracked light. I also tried using a small 14cm x 9cm photo light panel as a light source and a “rect light” in Unreal, but this didn’t look as good because the light panel generates a very wide diffuse beam which lights the area too well. The flashlight gave a better beam of light, so you could see it moving around.

I need to work out how to set up shadow catcher planes so Gadget can cast shadows on the floor and walls. I also need to figure out how to make the in-game light simulate light bouncing off the walls and floor so the self-shadows on Gadget aren’t completely black.

Anyone got advice on how to do it?

Has anyone checked the Blackmagic Intensity Pro 4K with the Unreal Blackmagic plugin?
I have tried with the Pro 4K card but was unable to get it working.

Hi sada_uw,

I did try using the Blackmagic Intensity Pro 4K; the Blackmagic plugin does not seem to recognize it. I'm pretty sure I tried it as a WMF source too, and that did not work either.

Also, I found that the Intensity Pro 4K does not work very well with some cameras: the GoPro Hero 4 sometimes works if you have a very high quality cable, while the Sony A7R and Sony action cam did NOT work at all. Blackmagic says this is a known problem with the “HDMI handshake of some cameras” and can’t be fixed. However, these cameras worked fine with many other capture devices I tried.

For the demos I’ve posted, I’ve been using an Elgato Camlink 4K connected to a Sony A7R DSLR by HDMI. Sometime in the next few weeks I will also get an AJA card to try and will test it with a number of cameras.

For some reason both the Unreal CG and the live video I’ve recorded don’t seem as sharp as they should. I need to go over the entire setup to figure out what’s causing this; it's probably something simple I’ve missed.

Hi, thank you. Did you check any other Blackmagic DeckLink card? I am planning to purchase the “Blackmagic DeckLink Mini Recorder 4K”. Does it support Unreal?

I have only tried the Intensity Pro 4k.

So far the Elgato Camlink 4K has been the only “cheap” solution that worked right with Unreal, but since it isn’t a pro card it doesn’t support things like genlock and frame sync. I'm not sure if it supports timecode in Unreal because I don’t have a camera that generates timecode right now.

One more sample: https://youtu.be/D5kg7LlRd0I This one has a BattleTech mech in the parking lot outside my window. There's not too much going on in it because it was shot late on Sunday, so not much in the background was moving. I'm also still working on video quality issues, and the VIVE tracker seemed to be having some trouble with this placement of the camera, so I may need to relocate it to get a better view of the lighthouses. I had a choice of filming this through dirty glass or through a screen, which is not ideal.

Though the point of it is to experiment, the main thing to note is that the mech and the Gadget character (by the mech's feet) are located a good 10 meters below the camera (up in my room on the third floor) and around 20 meters out. So if you have an accurate camera position and measurements, you can place very large CG objects quite a ways away from the camera and it still looks correct.

The zero point of the level is a bit behind where the camera is, and there is an invisible “floor” down 10 meters where the parking lot is. With this setup you could drive a CG car around the parking lot if you wanted.

A couple more demos are up on my channel here: https://www.youtube.com/user/GregCorson One shows the “Gadget” character standing next to me with much improved lighting. The virtual lights match the real lights in the room, and the shadows and hair highlights are pretty close now. It needs a bit more tuning to get the ambient light level right; you can see the shadows are a bit darker on “Gadget” than on me because of this. I’m also holding a tracker that is getting replaced by a big cartoon hammer. Tracking is very clean and synced up well; the delay is about 6 frames.

This is using an AJA Kona HDMI card, with genlock on and a 60Hz camera and renderer framerate. This should be workable with a USB capture device or webcam, but without genlock the real/virtual sync may not be as exact. This is NOT using any kind of mixed reality plugin. The real and virtual cameras are set up with the same settings, then adjusted slightly to match. Both the real-world and cine cameras are set to full-frame DSLR 16:9. The lens on the real camera is 35mm; to make it match, the Unreal camera had to be set to 36.5mm, and I'm not sure why. I did calibrate the camera/lens, but it turned out to have so little distortion that it really doesn’t make any difference. I need to try this with a different camera like a GoPro and see how well the function to remove lens distortion works.
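For anyone who would rather do the camera match from code than from the details panel, it boils down to something like the sketch below. Treat the member names as assumptions (they have shifted between UE4 versions; older releases call the filmback member FilmbackSettings), and measure your own sensor and lens rather than reusing these numbers.

```cpp
// A sketch of matching the CineCamera to the real camera described above.
#include "CineCameraComponent.h"

void MatchRealCamera(UCineCameraComponent* CineCam)
{
	if (!CineCam)
	{
		return;
	}

	// Full-frame sensor cropped to 16:9, in millimetres (don't trust the presets).
	CineCam->Filmback.SensorWidth = 36.0f;
	CineCam->Filmback.SensorHeight = 20.25f;

	// The real lens is 35mm, but 36.5mm lined up better in Unreal for this setup.
	CineCam->CurrentFocalLength = 36.5f;
}
```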

The project is here: GitHub - MiloMindbender/GadgetTest: Test of virtual sets and tracked cameras To use it you will have to adjust the video and light settings for your environment. Beware that it is over 3 GB because the “Gadget” assets are very large. Also, the first time you load it up it will seem to hang around 45%, but it’s really just building all the shaders, so give it time.

There is also a sample using the Unreal “Virtual Studio” demo project, with the same camera setup, but this one uses the greenscreen.

There is also a mixed reality sample using the LIV VR software and BeatSaber. There's no Unreal involved, but it’s interesting. I didn’t matte out the edges, so you can see how small an area you can have and still do this.

By the way, none of these are using any extra lighting, just normal room light.

Hi,

I’m trying to create a similar setup on the cheap and have been using your github files and info a lot. It has really helped me move through this process so thank you very much. I’ll post info on my setup as soon as I’ve ironed everything out.

I just got the BM Decklink Mini Recorder 4k and it does work!

This is REALLY good to hear! What cameras have you tried it with? My BM Intensity Pro 4K (which does not work with Unreal) had problems with the GoPro Hero 4 and all my Sony cameras.

I'd also love to hear whether other features (like genlock, timecode and lens un-distort) work with this card.

If you want to see my new sample, it is here: GitHub - MiloMindbender/UE4VirtualProduction: An example Unreal Engine Virtual Production Project There is NO documentation yet, but I am working on it. It is not a photorealistic scene, but it is WAY smaller at just 13 MB (the old one was 4 GB), so it's easier to download.

I am using an AJA Kona HDMI video card; I would love to hear how the project works with the BM, it should be pretty easy to convert over.

If you want to talk directly, send me a PM and I’ll give you my email.

A summary of what I’m trying to accomplish, my setup, where I am so far, and what I could use help with:

Background: I’m a filmmaker/VFX artist trying to use virtual production via my Vive to track the camera, and for things like virtual scouting and realtime comping of actors on a green screen stage for reference on set. I then plan on rendering out frames and doing more polished compositing in AE/Nuke later, since I’m not confident in my ability to pull a quality key/composite via blueprints. I’m hoping to do the indie DIY version of virtual production with no budget for a webseries/passion project. I am a UE newbie, so bear with me please.

Setup: I’m using a Blackmagic Production 4K camera through a Blackmagic Mini Recorder 4K via an SDI cable. I am also using a Vive, and I currently have one tracker puck mounted to my camera.

Where I’m at in this process: I’ve managed to get a Composure setup working with a tracked camera via the Vive tracker. The green key is pretty rough; for some reason my camera feed is very noisy in UE4 despite looking crystal clear in Blackmagic Media Express. This is fine for me since the key is simply a reference for my purposes, but is this normal?

I am also bringing in timecode and genlocking UE's output frames to the BM camera, which is great. I’m not jam syncing any timecode via other hardware, so I’m stuck with the Time of Day my BM is producing, but I’m mostly hoping this smooths the tracking discrepancies between UE and the camera footage. Because I’m rendering frames out later, I’m recording the camera movement via Take Recorder and then rendering it out after the shot has been recorded IRL. But this causes some issues (see below). Also, I tried to calibrate my setup via the MRCalibration tool, but it doesn’t recognize my BM camera via the BM Mini Recorder as a viable input. I have not gotten to lens un-distorting yet. Which leads me to…

Obstacles:

  • How can you calibrate without the MRCal tool, and how do you account for rig offsets and align everything? I’m currently using an Empty Actor which takes the transform data from the Vive tracker via BPs, and I child my Comp Camera to that, where I manually offset the transform data until my controllers roughly line up with my camera feed, but it’s very rough and tedious as hell. Any tips here for faster and better results? (There's a rough sketch of the offset math after this list.)

  • How do you prevent trackers from disconnecting all the time? I have to restart Unreal for my Motion Controller BP to recognize it as ‘Special_1’ every time it disconnects and it’s a real pain. I’ve tried the

  • Is it possible for Take Recorder to record the timecode the virtual camera is syncing with, so that when I play a camera back later it holds that previous TC?

  • How can I render out a camera that’s offset by a recorded empty actor? With my current setup I can tie the Vive tracker transform data to my camera, but it’s not offset properly to match the real camera, so when I render that camera it’s staring at the sky, for instance. If I record the offset empty actor, then the camera is static. I'm not sure how to render out the combination of these two things, as they appear to be completely separated in Sequencer/Take Recorder. Is there a better way to do this?

  • @.Corson I’ve tried to use your latest VP project, but for some reason I can’t even get the BM Media Bundle to display in this project. I used the same settings within the BM plugin on several other projects, but this one is giving me trouble. I only fired it up for a few minutes, so I’ll try again soon; I’m probably missing something obvious.
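Here is the rough shape of the offset approach from the first bullet, written as a transform composition instead of a child actor. This is only a sketch: the helper name and the offset numbers are placeholders, and the real values come from measuring (or eyeballing) your own rig.

```cpp
// Apply a fixed puck-to-lens offset to the tracked pose to get the camera's pose.
#include "CoreMinimal.h"

FTransform GetCameraWorldTransform(const FTransform& TrackerWorldTransform)
{
	// Offset from the tracker puck to the camera's optical center, in the puck's
	// local space (Unreal units / centimetres). Placeholder numbers only.
	const FTransform PuckToLens(FRotator(0.f, 0.f, 0.f), FVector(-3.f, 0.f, -8.f));

	// Child-in-world = child-local * parent-world, which is the same math the
	// "empty actor with a child camera" setup performs for you.
	return PuckToLens * TrackerWorldTransform;
}
```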

Thanks again to everyone in these related threads, it’s been incredibly helpful. If anyone has any insights on the problems above, I’d be grateful.

I'm not sure why my VP project is having problems with the BM plugin. You will have to make a new media bundle for the BM and plug it into the live background plate in Composure; that should be it. I think you also have to drop the media bundle into the level somewhere or it just doesn’t play. Give that a try and see if it helps.

I got around the tracker issue for now by plugging them into USB; then they never shut off. There is supposed to be a file where you can change the timeout, but I haven’t hunted it down yet.

Regarding the MRCalibration tool…I don’t use it. I set up a CineCamera with the same parameters as my real camera and it nearly worked; I had to set the cine camera to 37mm for my 35mm lens, but other than that I didn't do much. Be sure the image sensor size is exactly right for your camera, and don’t trust the presets. My 35mm lens is very flat, though; if yours has distortion, you may want to calibrate the lens and “un-distort” it. My AJA plugin has a section for this, and I assume the BM plugin does too. If you don’t have a lens calibration tool, I can dig up a link to the one I found, just let me know.

If you look at my sample, you can see the blueprints I use to delay the tracker data so it matches the video. I don’t have timecode yet, so I can’t help there. There is a tutorial video on my YouTube channel now ( https://www.youtube.com/user/GregCorson ) that explains how my sample project works.

Hi All,
I’m looking for a solution to the issue below.

I have one tracker on the camera and another on the actor.
In Unreal, the camera tracker has a camera mesh to relate to, and the actor tracker is attached to a plane which takes the live feed. Restricting the plane's X-axis rotation is the issue.

The problem is that the X-axis rotation needs to be restricted, and I cannot get that done.
Can you help with this? I'm not able to get around it, as I'm not that good with coding.
Can you see if you can find anything that helps?

Best.
Vivek.

Hi Vivek,

You shouldn’t need code for that, just a blueprint (see Unreal’s tutorials on blueprints, they are good).

I am not where I can check right now, but in blueprints you can get the transform from a motion controller, right-click on the transform output pin and “break” it (this separates X, Y, Z and roll, pitch, yaw onto six output pins), and then you can copy whatever data you want. To restrict X, just don’t connect that pin when you transfer the motion controller position to the target object.
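For reference, the same idea sketched in C++ looks something like this. The function and the PlaneActor parameter are placeholders for whatever holds your live-feed plane.

```cpp
// Copy a tracked transform to an actor while leaving its X-axis rotation (roll) alone.
#include "GameFramework/Actor.h"

void CopyTrackerPoseWithoutRoll(const FTransform& TrackerTransform, AActor* PlaneActor)
{
	if (!PlaneActor)
	{
		return;
	}

	// "Break" the transform into location and rotation.
	const FVector Location = TrackerTransform.GetLocation();
	FRotator Rotation = TrackerTransform.Rotator();

	// Don't copy the X-axis rotation (roll); keep whatever the plane already has.
	// This is the equivalent of leaving that pin unconnected in the Blueprint.
	Rotation.Roll = PlaneActor->GetActorRotation().Roll;

	PlaneActor->SetActorLocationAndRotation(Location, Rotation);
}
```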

Hi @.Corson,

Thank you for the quick reply. I have been in need of this.

I will try it and post the result here.

Thanks again.
Vivek