From what I have read so far in this forum, this topic has come up for discussion at some level, but I still cannot figure out what is wrong with the coordinates in my model. First, a bit of background:
I have drone pictures of an area in which I have marked out several control points on the ground (about 20 in total). The positions of the control points were all measured with GPS, with an accuracy of about 10 cm.
In RealityCapture I defined each control point (CP) in about 2-3 photographs; the CP type is "Ground Control" and its coordinate system is "Local:1 Euclidean". The coordinate system of the whole project (and the output coordinate system) is also set to "Local:1".
What I would like to end up with is a model (point cloud and OBJ) aligned to the coordinate system I measured in, but RealityCapture seems to have shifted the model into its own world after alignment and the "Normal Detail" reconstruction. Each CP has gained a new "Actual Position" coordinate, and the whole model is extremely tilted relative to the horizon (which makes viewing and rotating it in RC almost impossible). I also tried exporting the model and the sparse point cloud with different "Export transformation settings", but they all came out in the wrong position.
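In case it helps show what I mean by "shifted": as a stopgap I have considered re-aligning the exported points myself, by computing a best-fit rigid transform (rotation + translation, Kabsch method) from RC's reported "Actual Position" values to my measured GCP coordinates and applying it to the exported cloud. This is only my own sketch, not anything RC provides; the function name and arrays are illustrative:

```python
import numpy as np

def best_fit_rigid(src, dst):
    """Least-squares rotation R and translation t so that src @ R.T + t ~ dst.

    src: (N, 3) CP positions as RC reports them ("Actual Position").
    dst: (N, 3) the same CPs in my measured GPS coordinates.
    Classic Kabsch method via SVD of the cross-covariance matrix.
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against an improper rotation (reflection)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

# Applying it to an exported cloud `pts` of shape (N, 3):
#   aligned = pts @ R.T + t
```

Of course this only papers over the problem; what I would really like is for RC to export in the correct coordinates in the first place.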
So please, can somebody help me or explain why RC wants to put the model in the "wrong" coordinate world, and how I can export my model so that it is in the coordinates I have defined?