First off, a lot of people have been asking how to migrate my virtual production template into another UE project. It turns out this wasn’t easy to do (even for me!), so I spent some time working out what was going wrong, and now it’s very easy. If you grab the latest project from GitHub (MiloMindbender/UE4VirtualProduction: An example Unreal Engine Virtual Production Project), Readme_2 describes how to move it. With that, you can migrate everything to a new project in a couple of minutes.
The bad news is they have to do some construction in my building, so my studio will probably be down for about a week. The good news is I’m moving to a new location that has a proper light grid in the ceiling, so I should be able to hook up the OptiTrack and see how that works.
@LFedit It will probably take me 1-2 weeks at least to get the OptiTrack set up because of the studio move. Once it is working, I’ll put up some video and project updates for it. I also need to see if I can get a 4.24.1 version of the OptiTrack plugin to test.
@Tricky_3D Grab my project and take a look at “CompCameraRig” in it. Under “Details”, the SceneComponent’s location/rotation is the position of the camera (in my case, relative to “talentmarkerseparate”). If you look at the CameraComponent, its location/rotation is the offset from the tracker to the camera’s film plane. You said you had seen the picture of my camera rig; for that rig the offset is about -9, 0, -11.7 with a Y rotation of -90. There is also a “cone” mesh that represents the camera view; its location/rotation had to be set as well so the point of the cone sits (approximately) where the camera lens is. If you are wondering where my motion controller is, it’s inside “talentmarkerseparate”, which contains Blueprint code that delays and copies the position of the tracker over to CompCameraRig.
Once you set the offsets right, the method you describe for making a pawn SHOULD work. To make debugging easier, I suggest you go into your motion controller and check “display device model” so you can see the tracker inside Unreal. However, be aware that the origin of the tracker model in Unreal (last time I checked, anyway) is not exactly right. It seems to be at the center of the model rather than at the base of the tripod screw. This doesn’t affect what offsets you use; those should still be based on real-world measurements (in cm). Just be aware that even after you get the offsets right, the model of the tracker may not sit exactly on the origin of that cube.
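To show why the offset has to be measured in the tracker’s local frame (not world space), here’s a minimal Python sketch of the idea. This is NOT the Blueprint code from the project, just an illustration; it simplifies the full 3D rotation down to a single yaw angle about the vertical axis, and the function name is made up.

```python
import math

def camera_world_position(tracker_pos, tracker_yaw_deg, local_offset):
    """Rotate the measured tracker-to-camera offset into world space,
    then add it to the tracker's world position. All units in cm."""
    yaw = math.radians(tracker_yaw_deg)
    ox, oy, oz = local_offset
    # Yaw rotation about the vertical axis (Z up), applied to the
    # horizontal part of the offset so it turns with the tracker.
    wx = ox * math.cos(yaw) - oy * math.sin(yaw)
    wy = ox * math.sin(yaw) + oy * math.cos(yaw)
    tx, ty, tz = tracker_pos
    return (tx + wx, ty + wy, tz + oz)

# Offset measured on the rig described above (cm).
offset = (-9.0, 0.0, -11.7)

# Tracker at the origin, no yaw: the film plane sits at the raw offset.
print(camera_world_position((0, 0, 0), 0, offset))  # (-9.0, 0.0, -11.7)

# Turn the tracker 90 degrees and the offset turns with it,
# which is why a fixed world-space offset would be wrong.
print(camera_world_position((0, 0, 0), 90, offset))
```

The key point is that the -9, 0, -11.7 numbers come from measuring the physical rig with a tape measure; the engine then rotates that offset along with the tracker every frame.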
One other thing to watch out for if you are using COMPOSURE: the process you describe will make a tracked camera that works, but Composure will not recognize it. Composure does not see a camera component nested inside another actor; it only recognizes a camera actor. You need to create a camera actor, then copy the location/rotation into it like I do, and then Composure will see the camera (this drove me crazy for a while!). I’m trying to figure out a better way to do this, but for now the way it’s done in my project works.
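The workaround above is basically a “proxy camera” pattern: a standalone camera actor that mirrors the tracked component every frame. Here is a minimal Python sketch of that pattern; the class and attribute names are invented stand-ins for the Blueprint nodes, not real UE API.

```python
class Transform:
    """Bare-bones stand-in for a location + rotation pair."""
    def __init__(self, location=(0.0, 0.0, 0.0), rotation=(0.0, 0.0, 0.0)):
        self.location = location
        self.rotation = rotation

class TrackedRig:
    """Stands in for the camera component buried inside CompCameraRig,
    which Composure cannot see directly."""
    def __init__(self):
        self.transform = Transform()

class CameraActor:
    """Stands in for the standalone camera actor that Composure CAN see."""
    def __init__(self):
        self.transform = Transform()

def tick(rig, camera):
    # Once per frame, copy the rig's pose onto the camera actor.
    # Composure reads the camera actor, which now matches the tracker.
    camera.transform.location = rig.transform.location
    camera.transform.rotation = rig.transform.rotation

rig, cam = TrackedRig(), CameraActor()
rig.transform = Transform((120.0, -35.0, 160.0), (0.0, -90.0, 0.0))
tick(rig, cam)
print(cam.transform.location)  # (120.0, -35.0, 160.0)
```

In the actual project this copy happens in Blueprint, but the shape of the fix is the same: keep a real camera actor in the level and push the tracked transform into it every tick.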
Hope this helps!
I encourage everyone to post their experiences publicly! I’ve found a lot of weird little things that drove me crazy for a while until I figured them out, so posting your fails/successes will help other people and help Epic see where improvements are needed.
Greg