Calculating the camera frustum based on camera alignment export options

Hi Community,

I am currently working on a system to determine whether a given point inside a point cloud is visible within the images used to build that point cloud with RealityCapture (RC).

RC has a variety of camera alignment export options, but I am not sure which format would be best suited for this application. In order to create a frustum for a given distance from the camera, I need the vertical FOV expressed in degrees along with the aspect ratio.
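For context, a point-visibility test with those two values could be sketched like this (a minimal sketch, assuming the point has already been transformed into camera space with the camera looking down −Z; all names here are my own, not RC's):

```javascript
// Test whether a camera-space point lies inside a symmetric view frustum.
// vFovDeg: vertical field of view in degrees; aspect = width / height.
function pointInFrustum(p, vFovDeg, aspect, near, far) {
  const z = -p.z;                                  // distance in front of the camera
  if (z < near || z > far) return false;           // outside the near/far range
  const halfV = Math.tan((vFovDeg * Math.PI / 180) / 2); // half-height per unit depth
  const halfH = halfV * aspect;                    // half-width per unit depth
  return Math.abs(p.y) <= z * halfV && Math.abs(p.x) <= z * halfH;
}
```

Running this per image (with each camera's pose applied first) would tell you which images can see the point.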

I’ve tried a few of the camera export options, but so far I couldn’t find the required variables, or a way to translate the given variables into the required arguments.

Thanks for taking the time to read this post and thanks in advance for helping

 

  • Mark

Hi DiabetiMark, this function is already implemented in RealityCapture.

FOV should be listed in the camera parameters. Of the export options, the Internal/External camera parameters should help you.

Hi 374823146340,

Thanks for your swift reply! Sorry for being unclear: I know that RealityCapture itself can display those images. However, I am trying to implement something similar in a standalone viewer.

As you mention, the Internal/External camera parameters export does include a variety of variables, but I could not use them out of the box, and I don’t know how to convert them into the format described in my previous post on this topic.

  • Mark

 

Hi Mark, I think the FOV is not in these parameters, but it should be in the camera/lens parameter details.

Hi 374823146340,

I’ve been looking into it for some time now but sadly haven’t found the answer yet. I now understand which parameters are required to construct a frustum volume and to detect whether a given point of a point cloud lies within it.

 

For the FOV calculation I now know that the sensor width and height, image resolution, pixel ratio and focal length are used to determine the FOV, and these can indeed be derived from the picture information.
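For what it’s worth, the standard pinhole relation between sensor size and focal length gives the FOV directly (a sketch with my own function names; it assumes sensor height and focal length are in the same unit, e.g. millimetres):

```javascript
// Vertical FOV in degrees from sensor height and focal length (same unit).
function verticalFovDeg(sensorHeightMm, focalLengthMm) {
  return 2 * Math.atan(sensorHeightMm / (2 * focalLengthMm)) * 180 / Math.PI;
}

// If only the sensor width is known, the height follows from the image's
// pixel aspect ratio (assuming square pixels).
function sensorHeightFromWidth(sensorWidthMm, imageWidthPx, imageHeightPx) {
  return sensorWidthMm * imageHeightPx / imageWidthPx;
}
```

For a full-frame 24 mm sensor height and a 50 mm lens this gives roughly 27° vertical FOV; the aspect ratio is simply `imageWidthPx / imageHeightPx`.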

However, at this moment I am still not sure what the last few columns of the Internal/External camera parameters .csv export contain.

  • #name: name of the image
  • x, y, z: position of the camera in 3D space
  • heading, pitch, roll: rotation of the camera
  • f: focal distance (the distance between the lens and the camera; I assume it has been calculated by RC itself)

And then some variables unknown to me:

  • px, py
  • k1, k2, k3, k4
  • t1, t2

I was hoping you or someone else could shed some light on these last few parameters. Thanks in advance!

 

  • Mark

Hi Mark,

f is the focal length: the distance between the lens and the image sensor when the subject is in focus. So basically the same as your description.

px, py are the coordinates of principal point offset (the value from RC / 36)

k1 - k4 are the parameters of radial distortion

t1 - t2 are parameters of tangential distortion

Together they are parameters of interior camera orientation.
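To illustrate how such coefficients are typically used, here is a Brown–Conrady-style distortion sketch. Note this is an assumption on my part: RC’s exact polynomial and sign conventions may differ (check the documentation before relying on it); the snippet only shows how radial (k1–k4) and tangential (t1, t2) terms act on normalized image coordinates.

```javascript
// Apply radial (k1..k4) and tangential (t1, t2) distortion to a point
// (x, y) in normalized image coordinates (relative to the principal point).
function distort(x, y, k1, k2, k3, k4, t1, t2) {
  const r2 = x * x + y * y;                          // squared radius
  const radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3 + k4 * r2 ** 4;
  const xd = x * radial + 2 * t1 * x * y + t2 * (r2 + 2 * x * x);
  const yd = y * radial + t1 * (r2 + 2 * y * y) + 2 * t2 * x * y;
  return { x: xd, y: yd };
}
```

With all coefficients zero the point is returned unchanged, which is a quick sanity check for any implementation.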

Heya!

It’s been quite some time since my last post in this thread. I’ve been trying to display the camera positions in a 3D environment (THREE.js), but so far without any real luck. Positioning the cameras isn’t a problem, but setting the rotation still seems a bit problematic.

I assumed that heading, pitch and roll were given in degrees, but currently I’m not so sure, since some of the values (in the heading column, for example) are negative. I would like to know in what unit the rotation is expressed in the Internal/External camera parameters file. ^^

 

Online I read that in RealityCapture the Z-axis points up, whereas in THREE.js the Y-axis points up. We set the axes accordingly. Furthermore, I made sure that the model was not rotated or displaced; the camera positions are placed correctly. As THREE.js uses radians, and I assumed the .csv heading/pitch/roll values are in degrees, the degrees were converted to radians before setting the (Euler) angles.

One of the last things I tried was subtracting negative heading values from 360 to get positive degrees before converting them to radians (again assuming that the .csv values are in degrees).
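The conversions described above can be sketched as follows. The axis mapping is an assumption on my part: it relabels a Z-up position to Y-up while preserving handedness, but depending on your scene setup a different mapping may be needed.

```javascript
// Degrees to radians, as THREE.js expects angles in radians.
const degToRad = (d) => d * Math.PI / 180;

// One possible Z-up (RC) to Y-up (THREE.js) relabeling: (x, y, z) -> (x, z, -y).
function zUpToYUp(p) {
  return { x: p.x, y: p.z, z: -p.y };
}
```

Note that subtracting negative headings from 360 should not be necessary: `degToRad(-5)` and `degToRad(355)` describe the same rotation, so negative values alone would not explain a wrong orientation.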

 

 

In summary I would like to know:

I. In what unit are Heading/Pitch/Roll expressed in the Internal/External camera parameters file?

II. Is the rotation of a camera relative to the model’s rotation (and should it therefore be composed with the model’s rotation), or is it given in world coordinates?

III. Any other suggestions of things I could try?

 

 

I’m looking forward to a response.

  • Mark

Hi Mark,

the unit used for Heading/Pitch/Roll is degrees. For Heading it is ±180° from the north/Y axis, for Pitch it is ±90°, and Roll should also be ±90°.

For example, this image is rotated by an angle of -5°.

Rotation is bound to the model coordinate system; when your project is georeferenced, it is bound to the geo coordinate system.

How would one correctly calculate an Euler or quaternion rotation from the heading, pitch and roll?
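One plausible construction, sketched under explicit assumptions: heading is applied first about the up axis, then pitch, then roll, intrinsically. The axis order and signs here are my guesses, not RC's documented convention, so verify them against a camera whose orientation you already know before using this.

```javascript
// Quaternion from a single axis-angle rotation (unit axis, angle in radians).
function quatFromAxisAngle(ax, ay, az, angleRad) {
  const s = Math.sin(angleRad / 2);
  return { w: Math.cos(angleRad / 2), x: ax * s, y: ay * s, z: az * s };
}

// Hamilton product: applies b first, then a, when rotating vectors.
function quatMul(a, b) {
  return {
    w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
    x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
  };
}

// Heading about Z (up in RC), then pitch about X, then roll about Y,
// composed as an intrinsic H -> P -> R sequence. Axes/signs are assumptions.
function quatFromHpr(headingDeg, pitchDeg, rollDeg) {
  const d = Math.PI / 180;
  const qh = quatFromAxisAngle(0, 0, 1, headingDeg * d);
  const qp = quatFromAxisAngle(1, 0, 0, pitchDeg * d);
  const qr = quatFromAxisAngle(0, 1, 0, rollDeg * d);
  return quatMul(quatMul(qh, qp), qr);
}
```

In THREE.js you could instead build a `THREE.Euler` with an explicit rotation order and let `Quaternion.setFromEuler` do the composition; the key point is that the order and the axis meanings must match what RC exports.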