Compute world and image coordinates with exported intrinsic and extrinsic camera parameters

Hi,

To visualize the RealityCapture workflow, I want to map e.g. a control point marked in one image into 3D space and then back into another image. So basically the classic Photogrammetry I class exercise.
For that I have a small dataset and exported the internal and external camera parameters for my own calculations (and for the students later on).
Unfortunately, I have not managed to compute this from the given parameters.

Right now I have just exported the standard parameter set:

name,x,y,alt,heading,pitch,roll,f,px,py,k1,k2,k3,k4,t1,t2

But I will add the image width and height later, as described here: Post 1

I found similar posts like Post 2 and Post 3, but some of the images are dead or the links are broken.

Can someone, maybe @OndrejTrhan himself, give me a short hint on how to compute a world coordinate from a pixel coordinate with the exported camera parameters, and also the other way around? I'm especially confused by the 35mm equivalent focal length and the principal point (which seems to be an offset from the center of the image, i.e. from [imageWidth/2, imageHeight/2]).
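
My current guess for these two conversions is something like the following (just a sketch in MATLAB; imageWidth and imageHeight are the image dimensions in pixels, which I still have to export, and I am not sure this matches the RealityCapture convention):

% Guess: the 35mm equivalent focal length refers to a full-frame sensor (36 x 24 mm),
% so the focal length in pixels would be scaled by the 36 mm sensor width (square pixels assumed):
fPix = f / 36 * imageWidth;

% Guess: px and py are offsets from the image center, so the absolute principal point would be:
cx = imageWidth/2 + px;
cy = imageHeight/2 + py;

My full attempt then looks like this: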

cameraPosition = [x; y; alt]; % Camera position in the world coordinate system (column vector)

cameraOrientation = eul2rotm(deg2rad([heading, pitch, roll]), 'XYZ'); % Camera orientation as a rotation matrix (assuming the exported angles are in degrees; is 'XYZ' the right axis order?)

% Compute the extrinsic matrix (Rt)
R = cameraOrientation;
t = -R * cameraPosition;
Rt = [R, t];

% compute intrinsic matrix
fx = focalLength; % ?? what unit (or should this be fPix from my guess above?)
fy = focalLength; % ?? what unit

K = [fx, 0, cx;
     0, fy, cy;
     0,  0,  1];
Kinv = inv(K);

pixelCoords = [u; v; 1]; % Homogeneous coordinates of the pixel
normalizedCoords = Kinv * pixelCoords;

worldCoords = Rt \ normalizedCoords; % attempt to invert the projection (note: Rt is 3x4, so this system is underdetermined)
worldCoords = worldCoords(1:3) / worldCoords(4);

I think that the first two steps (the intrinsic and extrinsic matrix calculation) are faulty, but I cannot see where…
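
To check K and Rt, I would also try the forward direction (world → pixel) with a known point, something like the sketch below (again just my guess, with Xw, Yw, Zw as placeholders for a point given in the same coordinate system as the camera positions, and ignoring lens distortion and RealityCapture's exact axis conventions):

X = [Xw; Yw; Zw; 1]; % known 3D point in homogeneous coordinates, e.g. a control point
xCam = Rt * X; % transform into camera coordinates
xImg = K * xCam; % apply the intrinsics
uProj = xImg(1) / xImg(3); % projected pixel coordinates
vProj = xImg(2) / xImg(3);

% Note: going the other way, Kinv * [u; v; 1] only gives a viewing ray,
% so a 3D point also needs a depth value or a second ray from another image.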

Hi schmilor, this could be helpful for you: RealityCapture XMP Camera Math

@schmilor Did you manage to compute this?