Calibration

To measure the accuracy, the results of calibration (inner and outer geometry) are used to re-project the world coordinate points to image coordinates. In an ideal model, the image points and the re-projected points would perfectly match. Unfortunately, this does not happen.

Figure 4.1:
Re-projection of the calibration pattern in MAYA. Fog has been added to make the difference between the re-projected pattern and the original image visible. The bottom right rectangle fits best. The top right is slightly translated in plus y direction, the top left is translated in minus x and y direction, and the bottom left is translated in minus x and plus y direction.
Image maya_reproject

Figure 4.1 shows the re-projection using MAYA. The translation vector and the inverse rotation matrix were used to move and rotate the camera. The intrinsic parameters of the real camera and of the camera used for re-projection in MAYA match. Unfortunately, lens distortion cannot be modeled in MAYA; nevertheless, the calibration pattern fits almost perfectly. The result can serve as a visualization to demonstrate the potential, but it cannot be used to measure the accuracy of the calibration. For that, the world coordinates have to be re-projected using a mathematical formulation in which lens distortion is modeled, i.e. the same model as used for calibration. The transformation that projects the world coordinates to image coordinates passes through the following stages.

$\displaystyle \begin{pmatrix}X_w \\ Y_w \\ Z_w\end{pmatrix} \rightarrow \begin{pmatrix}X_c \\ Y_c \\ Z_c\end{pmatrix} \rightarrow \begin{pmatrix}x_u \\ y_u\end{pmatrix} \rightarrow \begin{pmatrix}x_d \\ y_d\end{pmatrix}$ (4.1)


In other words, the 3D world coordinates are transformed (translated and rotated) to 3D camera coordinates. The intrinsic parameters of the camera are used to project the camera coordinates onto the 2D image plane. A 2D $\rightarrow$ 2D transformation models the distortion effect.
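The three stages can be sketched in a few lines of Python. This is a minimal illustration of a pinhole model with a single radial distortion coefficient; all numeric values (rotation, translation, focal length, principal point, $k_1$) are hypothetical and are not taken from the calibration described here.

```python
import math

# Hypothetical extrinsic and intrinsic parameters (illustration only).
R = [[1.0, 0.0, 0.0],          # rotation matrix (identity for simplicity)
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]
t = [0.0, 0.0, 10.0]           # translation: camera 10 units from the pattern
f = 800.0                      # effective focal length in pixels
cx, cy = 320.0, 240.0          # principal point
k1 = -0.05                     # first radial distortion coefficient

def project(Xw, Yw, Zw):
    """Project a 3D world point to distorted 2D pixel coordinates."""
    # Stage 1: world -> camera coordinates (rotate, then translate)
    Xc = R[0][0]*Xw + R[0][1]*Yw + R[0][2]*Zw + t[0]
    Yc = R[1][0]*Xw + R[1][1]*Yw + R[1][2]*Zw + t[1]
    Zc = R[2][0]*Xw + R[2][1]*Yw + R[2][2]*Zw + t[2]
    # Stage 2: camera coordinates -> undistorted image plane (x_u, y_u)
    xu, yu = Xc / Zc, Yc / Zc
    # Stage 3: 2D -> 2D radial distortion (x_u, y_u) -> (x_d, y_d)
    r2 = xu*xu + yu*yu
    xd = xu * (1.0 + k1 * r2)
    yd = yu * (1.0 + k1 * r2)
    # Scale and shift by the intrinsics to obtain pixel coordinates
    return (f * xd + cx, f * yd + cy)
```

A point on the optical axis, e.g. `project(0.0, 0.0, 0.0)`, lands exactly on the principal point, since its distortion radius is zero.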

To calibrate the cameras, at least two images of the calibration pattern are needed. Figure 4.2 shows ten images recorded with the left camera.

Figure 4.2:
Images of the calibration pattern, recorded with the left camera of the stereo camera system
Image alltogether_l

The patterns have different attitudes and positions in every image. This is necessary; otherwise the calibration method would be unable to solve the calibration problem correctly. Once the calibration is done, the attitude of each calibration pattern, more precisely its translation and rotation relative to the camera, is known. Figure 4.3 shows the reconstructed calibration patterns.

Figure 4.3:
Reconstructed calibration pattern
Image extrinsic_geometry_l

We can use this information to re-project the calibration patterns onto the images. Figure 4.4 shows the result of re-projecting the upper left corner of each image. It is a zoomed view, so the deviation is visible; the depicted cut-out has a size of 5x5 px. Calculated edge points are marked as circles and re-projected points as crosses.

Figure 4.4:
Calculated edge point $\circ$ and re-projected point $+$
Image zoom_reprojected_l

The estimator for the standard deviation of the differences between the original points and the re-projected ones is called the pixel error. It can be used as an estimate of the accuracy of the re-projection. Table 4.1 shows the re-projection error for every calibration image. It can be seen that the re-projection is very accurate: the calibration works with subpixel accuracy, and the mean pixel error is below 0.17 pixels. The means and medians in x and y are very small and can be decreased further by adding more calibration images.
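The per-axis statistics reported in Table 4.1 can be computed with Python's standard `statistics` module, whose `stdev` already uses the sample (unbiased) estimator. The coordinate values below are made up for illustration and are not taken from the thesis data.

```python
import statistics

# Hypothetical detected corner x-coordinates and their re-projections (px)
detected    = [101.20, 233.45, 310.07, 150.88]
reprojected = [101.25, 233.40, 310.12, 150.80]

# Signed per-point deviations along one axis
errors = [d - r for d, r in zip(detected, reprojected)]

mean_err   = statistics.mean(errors)     # "Mean (x)" column
median_err = statistics.median(errors)   # "Median (x)" column
pixel_err  = statistics.stdev(errors)    # sample std dev, divisor N-1
```

The same computation, applied to the y-deviations, yields the remaining columns of the table.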

Figure 4.5:
Re-projection error

Figure 4.5 shows the re-projection errors of the 10 calibration images for the left camera. The calibration pattern has 48 internal corners, so 48 points are plotted in a separate color for every image. All deviations from the center are below 0.6 px.

Table 4.1:
Re-projection error

           Mean (x)   Mean (y)    Median (x)  Median (y)  $\hat\sigma_x$  $\hat\sigma_y$
Image 1    -0.0019     0.0009102   0.0039     -0.0382      0.1118          0.2048
Image 2    -0.0011    -0.0004358  -0.0106     -0.0368      0.1197          0.1943
Image 3    -0.0007     0.000634   -0.0018      0.000143    0.1935          0.1122
Image 4    -0.00004    0.000355    0.0034      0.0041      0.1570          0.1431
Image 5    -0.00056    0.000945    0.0282      0.0265      0.2096          0.1450
Image 6     0.0012    -0.000878    0.0167     -0.0045      0.1143          0.1328
Image 7     0.00025    0.000406   -0.00061    -0.0131      0.1306          0.1125
Image 8     0.0018     0.000240    0.0052      0.0067      0.2016          0.1349
Image 9    -0.00071   -0.0015     -0.0152      0.0261      0.2121          0.1571
Image 10    0.0018     0.0011     -0.0162      0.0293      0.1429          0.1750
All         0.00000    0.00000     0.0031      0.0020      0.16248         0.15269

Equation 4.2 shows the pixel error in closed form:

$\displaystyle \hat\sigma_x = \sqrt{\frac{1}{N-1}\sum\limits_{i=1}^{N}\left(X_i - \hat\mu\right)^2}$ (4.2)


In order to ensure unbiasedness, note that the divisor in Equation 4.2 is $N-1$ and not $N$, as it would be if the true mean $\mu$ were known. The estimator $\hat\mu$ is calculated in the same way as the mean $\mu$.
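The effect of the two divisors can be seen in plain Python; the sample values below are hypothetical per-point deviations, chosen only for illustration.

```python
import math

samples = [0.12, -0.08, 0.05, -0.03, 0.10]   # hypothetical deviations (px)
N = len(samples)

# The estimator of the mean is computed exactly like the mean itself
mu_hat = sum(samples) / N

# Biased (divisor N) vs. unbiased (divisor N-1) variance estimators
var_biased   = sum((x - mu_hat) ** 2 for x in samples) / N
var_unbiased = sum((x - mu_hat) ** 2 for x in samples) / (N - 1)

# The pixel error of Equation 4.2 is the square root of the unbiased variance
sigma_hat = math.sqrt(var_unbiased)
```

Because the deviations are measured from the estimated mean $\hat\mu$ rather than the true mean $\mu$, dividing by $N$ would systematically underestimate the variance; the divisor $N-1$ corrects this.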

The quality of the calibration has no influence on the quality of the line detection, which is investigated in the next section, but it has a strong influence on the correspondence analysis and the 3D reconstruction. The line detection works on rectified images, which are computed using the fundamental matrix $F$, and $F$ results from the prior calibration. If the rectification does not work properly, the attitudes of the lines are distorted and thus correspondences may not be established. In addition, the disparity between two lines is falsified, which directly affects the 3D reconstruction; the inner camera parameters influence the final result as well (e.g. depth depends on the effective focal length $f_k$, and $X_c$ and $Y_c$ depend on the principal point). If the camera coordinates are transformed into world coordinates, errors in the extrinsic camera parameters are reflected in the world coordinates.
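How an intrinsic calibration error propagates into depth can be illustrated with the standard pinhole stereo relation $Z = f \cdot B / d$. The numbers below (focal length, baseline, disparity) are hypothetical and do not come from the system described here.

```python
# Standard pinhole stereo: depth Z from focal length f, baseline B, disparity d
f_true = 800.0   # effective focal length in pixels (hypothetical)
B = 0.12         # baseline in metres (hypothetical)
d = 16.0         # disparity in pixels (hypothetical)

Z_true = f_true * B / d   # reconstructed depth with the correct focal length

# A 1% error in the calibrated focal length propagates directly
# into a 1% error in the reconstructed depth
f_bad = f_true * 1.01
Z_bad = f_bad * B / d
```

The relative depth error equals the relative focal-length error, which is why even small calibration inaccuracies matter for the 3D reconstruction.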

