iPhone tracking for camera virtual production

Hello friends! I have seen a lot of people try using an iPhone as a tracker with a live camera input such as a DSLR, but has anyone found solutions for things like manually setting the floor, timecode lock/sync, and nodal offset, so it can be used for virtual production?

I’m doing the same thing and have gotten it to work reasonably well.

  • Phone is an iPhone 12 Pro
  • Camera is a Blackmagic Pocket 4K
  • Virtual Camera app is LiveLink VCam
  • Timecode is from a Tentacle Sync
  • Capture card is a Blackmagic DeckLink Mini Recorder 4K

I highly recommend watching Greg Corson’s YouTube tutorial, “Virtual Production Project Tutorial from scratch with Unreal Engine and RETracker Bliss!”.

Although he is using Bliss trackers, most of his steps work with an iPhone, including lens calibration and world positioning via ArUco tags.

The only part that doesn’t work is nodal offset via AprilTags, so I ended up following his tutorial for determining the nodal offset manually. I think there is still a way to do it with just a grid or an AprilTag.
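For anyone new to the term: the nodal offset is just a fixed rigid transform from the tracker’s reference point (here, the iPhone) to the lens’s no-parallax point — measured once, then composed onto the tracked pose every frame. A toy sketch in plain Python (illustrative made-up numbers, not Unreal’s API):

```python
# Toy sketch: composing a fixed nodal offset onto a tracked pose.
# 4x4 row-major transforms; all numbers below are made up.

def mat_mul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def translation(x, y, z):
    """Build a 4x4 translation matrix."""
    return [[1, 0, 0, x],
            [0, 1, 0, y],
            [0, 0, 1, z],
            [0, 0, 0, 1]]

# Hypothetical pose: phone tracked 1.5 m above the floor origin
tracker_pose = translation(0.0, 1.5, 0.0)

# Measured once: nodal point 12 cm forward of and 3 cm below the
# phone's tracking origin (made-up values)
nodal_offset = translation(0.12, -0.03, 0.0)

# Applied every frame: camera pose = tracker pose * nodal offset
camera_pose = mat_mul(tracker_pose, nodal_offset)
print(round(camera_pose[0][3], 3), round(camera_pose[1][3], 3))  # 0.12 1.47
```

In practice the offset also has a rotation component, but the idea is the same: it is one constant transform chained after the tracker’s, which is why measuring it accurately once matters so much.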

As for timecode, I have tried three different approaches:

  • Use the timecode from the DeckLink card. This hasn’t worked: the timecode is way off and seems to drift. It also appears to interpret the timecode as drop-frame instead of non-drop-frame (NDF), and it’s off by about a minute. I can verify that the timecode is reaching the hardware correctly (using Blackmagic Media Express or Resolve’s capture interface), so I presume the hardware support is incomplete — this particular card isn’t listed as having been tested.
  • LiveLink timecode: The LiveLink VCam app can sync to a Tentacle Sync over Bluetooth, and the LiveLink timecode provider works great.
  • LTC timecode via audio: Feeding a second Tentacle Sync into the computer’s mic input also works.
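A side note on the “off by about a minute” symptom: I can’t say that’s what is happening with the DeckLink card, but drop-frame and non-drop-frame label the same frame counts differently, and at time-of-day timecode values the mismatch alone reaches roughly a minute (about 108 frame numbers per hour at 29.97 fps). A quick sketch of the standard SMPTE conversions, in plain Python (nothing from Unreal or the DeckLink API):

```python
# Sketch: SMPTE drop-frame (DF) vs non-drop-frame (NDF) numbering at
# 29.97 fps -- standard SMPTE counting rules, shown only to illustrate
# the scale of a DF/NDF mislabel.

def ndf(frames):
    """Non-drop-frame: straight base-30 frame numbering."""
    f = frames % 30
    s = (frames // 30) % 60
    m = (frames // 1800) % 60
    h = frames // 108000
    return f"{h:02d}:{m:02d}:{s:02d}:{f:02d}"

def df(frames):
    """Drop-frame: skip frame numbers 0 and 1 each minute, except
    minutes divisible by 10 (18 numbers dropped per 10 minutes)."""
    dropped = (frames // 17982) * 18        # full 10-minute blocks
    rem = frames % 17982
    if rem >= 2:
        dropped += 2 * ((rem - 2) // 1798)  # extra whole minutes
    frames += dropped
    f = frames % 30
    s = (frames // 30) % 60
    m = (frames // 1800) % 60
    h = frames // 108000
    return f"{h:02d}:{m:02d}:{s:02d};{f:02d}"  # ';' marks drop-frame

# Same frame count, two labels -- the gap grows with running time:
print(ndf(17982), df(17982))      # 00:09:59:12 00:10:00;00
print(ndf(1836000), df(1836000))  # 17:00:00:00 17:01:01;08
```

So by late afternoon on a time-of-day clock, a mislabeled DF/NDF flag looks like “off by about a minute” — worth ruling out before blaming the hardware.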

Genlock is from the Blackmagic card.
