IV. EVALUATION
In this section, we present results from flight experiments that validate the proposed approach. The INS used is based on fibre optic gyroscopes, which have a bias stability of a few hundredths of a degree per hour. In combination with real-time kinematic (RTK) enhanced GPS measurements, very accurate pose information is generated. The optical system consists of a downward-looking camera with 8 megapixels and a wide-angle lens with a field of view of 54° in the horizontal and 42° in the vertical direction. The sensors were integrated into a payload pod, which was mounted beneath the wing of a manned ultralight airplane (Fig. 1). The flown course must introduce measurements that constrain all dimensions of the calibration parameters. For our platform, small movements about all axes occur even during straight and level flights, which aim at a constant heading and altitude through immediate corrections of unintentional movements.

We performed a total of four flights within two days. At an altitude of 300 meters and above, we captured two images per second at a speed of approximately 125 km/h. The flights were performed as crossing straight lines (Fig. 2). To achieve a high image overlap, we use only images within a circle with a radius of 600 meters around the central point. This results in a total of nearly 700 images for the first two flights and 300 images for the other two.
These images were used to calculate an initial 3D point cloud of the observed area with an SFM approach that takes the camera poses into account [17]. The latter were generated by concatenating the INS measurements with the initial mounting offsets determined through terrestrial measurements.
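As a hedged illustration of this concatenation step, the following minimal Python sketch composes a camera pose from an INS pose and the terrestrially measured mounting offsets (boresight rotation and lever-arm). The rotation convention and all variable names are our own assumptions for illustration, not the exact implementation used in our pipeline.

```python
import numpy as np

def rot_zyx(yaw, pitch, roll):
    """Rotation matrix from yaw-pitch-roll angles in radians (z-y-x convention)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def camera_pose_from_ins(R_world_ins, t_world_ins, R_ins_cam, lever_arm):
    """Concatenate an INS pose with the mounting offsets (boresight rotation and
    lever-arm) to obtain the camera pose in world coordinates."""
    R_world_cam = R_world_ins @ R_ins_cam
    t_world_cam = t_world_ins + R_world_ins @ lever_arm
    return R_world_cam, t_world_cam

# Hypothetical numbers: one INS navigation solution plus terrestrial offsets.
R_wi = rot_zyx(np.deg2rad(45.0), np.deg2rad(1.2), np.deg2rad(-0.8))
t_wi = np.array([1000.0, 2000.0, 300.0])                              # world position [m]
R_ic = rot_zyx(np.deg2rad(0.5), np.deg2rad(-0.3), np.deg2rad(0.2))    # boresight rotation
lever = np.array([0.10, -0.05, 0.30])                                 # lever-arm [m]
R_wc, t_wc = camera_pose_from_ins(R_wi, t_wi, R_ic, lever)
```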
The output of the SFM (Fig. 8) was used as input for the graph optimization, where the pixel observations were introduced as measurements with an accuracy of 1 pixel. Since no quality log files of the INS were available, we used the manufacturer specification of an accuracy of 2 cm for the position, 0.04° for the yaw angle, and 0.01° for the pitch and roll angles.

Fig. 8. Visualization of the camera poses and 3D points introduced as vertices in our graph optimization.
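To make the weighting of these measurements concrete, the sketch below converts the standard deviations stated above (1 pixel, 2 cm, 0.04° and 0.01°) into diagonal information matrices, i.e. inverse covariances, of the kind attached to reprojection and pose-prior edges in g2o-style graph optimizers. The helper name and the ordering of the pose components are assumptions made for this example.

```python
import numpy as np

def information_from_sigmas(sigmas):
    """Diagonal information matrix (inverse covariance) from per-component
    standard deviations."""
    sigmas = np.asarray(sigmas, dtype=float)
    return np.diag(1.0 / np.square(sigmas))

# Reprojection edges: pixel observations with an accuracy of 1 pixel in u and v.
info_pixel = information_from_sigmas([1.0, 1.0])

# INS pose priors: 2 cm per position axis, 0.01 deg for roll and pitch,
# 0.04 deg for yaw (angles converted to radians).
sigma_position = [0.02, 0.02, 0.02]                      # [m]
sigma_angles = np.deg2rad([0.01, 0.01, 0.04]).tolist()   # [rad]
info_pose = information_from_sigmas(sigma_position + sigma_angles)

print(info_pixel.shape, info_pose.shape)  # (2, 2) and (6, 6)
```

Weighting the yaw prior more weakly than roll and pitch in this way simply reflects the lower heading accuracy stated by the INS manufacturer.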
Our previous work [6] reveals that optimizing the translational part of the mounting offsets leads to a lower accuracy than the usual terrestrial measurements. As further stated in [18], at least one GCP is needed so that the estimation of the vertical lever-arm and of all other mounting and internal camera parameters can be decoupled from each other. Due to these observations, we fixed the lever-arm to the terrestrial measurements in our optimization.

The resulting calibration parameters show a high stability across the different flights and differ from the initial checkerboard calibration (Table I). This holds especially for the distortion parameters, which we omit here for the sake of readability. The differences between the optimized boresight angles ψB, θB and φB are very close to the stated accuracy of the INS. Small variations in the intrinsic camera parameters occur most likely due to different climate conditions during the flight executions. To evaluate the accuracy of the achieved results, we performed a least-squares forward intersection for the pixel observations of five GCPs.

Fig. 9. We used five ground control points to verify the accuracy of our approach. These were placed at dominant image corners to allow easy manual measurement of their image coordinates (orange circles).
TABLE II. MEAN EUCLIDEAN DISTANCE BETWEEN FIVE GROUND CONTROL POINTS AND THE FORWARD INTERSECTION FOR THE INITIAL AND THE OPTIMIZED CALIBRATION (LEFT) AS WELL AS THE CALIBRATION FROM FLIGHT 1 FOR ALL FLIGHTS (RIGHT)

            init. [m]   opt. [m]   gain [factor]                opt. [m]   opt. flight 1 [m]   gain [factor]
  flight 1     3.17       0.53         5.98         flight 1      0.53          0.53               1.00
  flight 2     2.94       0.47         6.26         flight 2      0.47          0.55               0.85
  flight 3     2.02       0.37         5.46         flight 3      0.37          0.66               0.56
  flight 4     4.45       0.44        10.11         flight 4      0.44          0.58               0.76
The image coordinates of these points were measured manually (Fig. 9) and used to perform a forward intersection with the initial and the optimized camera poses. This leads to 3D coordinates, which were compared to values measured with a mobile GPS receiver. The latter has a stated horizontal accuracy of about 30 cm and a vertical accuracy of about 50 cm. Our results from the forward intersection are in the same range, which shows the performance of our approach (Table II). We assume that the larger initial error of flight 4 occurs due to its altitude range of 300 to 800 meters, which was smaller for the other flights. Nevertheless, the results obtained with the optimized calibration parameters for flight 4 are in the same range as for the other flights. Generating the camera poses for all flights from the calibration results of flight 1 leads to a slightly decreased performance (Table II). Given an altitude of 300 meters and above, the accuracies are high and clearly outperform our terrestrial calibration.
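To outline the evaluation step, the sketch below shows a standard linear least-squares forward intersection (DLT-style triangulation) of a single ground control point from its pixel observations in several images, together with the mean Euclidean distance against reference coordinates. The 3x4 projection-matrix interface and the function names are assumptions for illustration rather than the exact code behind Table II.

```python
import numpy as np

def forward_intersection(projections, pixels):
    """Linear least-squares triangulation of one 3D point from its pixel
    observations in several images. projections: list of 3x4 camera matrices,
    pixels: list of (u, v) observations in the corresponding images."""
    rows = []
    for P, (u, v) in zip(projections, pixels):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # The solution is the right singular vector of the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

def mean_euclidean_error(estimated, reference):
    """Mean Euclidean distance between estimated and reference 3D coordinates."""
    diffs = np.asarray(estimated) - np.asarray(reference)
    return float(np.mean(np.linalg.norm(diffs, axis=1)))

# Hypothetical use per GCP: intersect once with the initial and once with the
# optimized camera poses, then compare both solutions to the GPS reference, e.g.
# err_init = mean_euclidean_error(points_init, points_gps)
```

Applying such a routine to each GCP with the initial and with the optimized camera poses yields distances of the kind reported in Table II.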
V. CONCLUSION AND FUTURE WORK
In this paper, we presented a graph-based approach for the system calibration of a sensor suite consisting of a fixed-mounted camera and an INS. We showed how to phrase the optimization problem as a graph and estimated the mounting offsets between the devices as well as the intrinsic camera parameters with a graph optimization framework. Our evaluation points out that a straightforward system calibration without the use of GCPs leads to results with high potential for cost-saving in-flight calibrations. It was shown that the calibration results can be reused for consecutive flights, but the highest precision is obtained by performing the INS-camera calibration during the mission itself. Compared to our terrestrial calibration, we achieved an improvement of roughly a factor of six in our experiments, which can be even higher for other setups due to larger angular misalignments between the devices.

Future work will investigate the proposed procedure in more detail. Furthermore, we will perform experiments with a cost-efficient MEMS INS and evaluate whether the approach is also usable for the INS-camera calibration when no RTK corrections are applied.
REFERENCES

[1] M. Schikora, D. Bender, W. Koch, and D. Cremers, "Multi-target multi-sensor localization and tracking using passive antennas and optical sensors on UAVs," Proc. SPIE Security + Defence, vol. 7833, pp. 1-9, 2010.
[2] M. Schikora, D. Bender, and W. Koch, "Airborne emitter tracking by fusing heterogeneous bearing data," in Proc. of the 17th International Conference on Information Fusion (FUSION), 2014.
[3] L. Pinto and G. Forlani, "A single step calibration procedure for IMU/GPS in aerial photogrammetry," International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 34, Part B3, pp. 210-219, 2002.
[4] C. Heipke, K. Jacobsen, and H. Wegmann, "Analysis of the results of the OEEPE test 'Integrated sensor orientation'," Integrated Sensor Orientation - Test Report and Workshop Proceedings, OEEPE Official Publications No. 43, pp. 31-49, 2002.
[5] R. Kuemmerle and G. Grisetti, "g2o: A General Framework for Graph Optimization," in Proc. of the IEEE International Conference on Robotics and Automation (ICRA), 2011, pp. 3607-3613.
[6] D. Bender, M. Schikora, J. Sturm, and D. Cremers, "Graph-based bundle adjustment for INS-camera calibration," International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XL-1/W2, pp. 39-44, 2013.
[7] R. Tsai and R. Lenz, "A new technique for fully autonomous and efficient 3D robotics hand/eye calibration," in Proc. of the 4th International Symposium on Robotics Research, 1989, pp. 287-297.
[8] R. Horaud and F. Dornaika, "Hand-Eye Calibration," International Journal of Robotics Research, vol. 14, no. 3, pp. 195-210, 1995.
[9] J. Lobo and J. Dias, "Relative Pose Calibration Between Visual and Inertial Sensors," International Journal of Robotics Research, vol. 26, no. 6, pp. 561-575, 2007.
[10] F. Mirzaei and S. Roumeliotis, "A Kalman filter-based algorithm for IMU-camera calibration: Observability analysis and performance evaluation," IEEE Transactions on Robotics, vol. 24, no. 5, pp. 1143-1156, 2008.
[11] S. Weiss and M. Achtelik, "Versatile distributed pose estimation and sensor self-calibration for an autonomous MAV," in Proc. of the IEEE International Conference on Robotics and Automation (ICRA), 2012, pp. 31-38.
[12] M. Cramer, D. Stallmann, and N. Haala, "Direct georeferencing using GPS/inertial exterior orientations for photogrammetric applications," International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 33, Part B3, pp. 198-205, 2000.
[13] D. C. Brown, "Decentering distortion of lenses," Photogrammetric Engineering, vol. 32, no. 3, pp. 444-462, 1966.
[14] K. Jacobsen, "Aspects of handling image orientation by direct sensor orientation," in Proc. of the ASPRS Annual Convention, 2001.
[15] R. Hartley and A. Zisserman, Multiple View Geometry in Computer Vision. Cambridge University Press, 2004.
[16] M. Lourakis and A. Argyros, "SBA: A software package for generic sparse bundle adjustment," ACM Transactions on Mathematical Software, vol. 36, no. 1, pp. 1-30, 2009.
[17] C. Wu, "Towards linear-time incremental structure from motion," in International Conference on 3D Vision, 2013, pp. 127-134.
[18] A. Kersting, A. Habib, and K. Bang, "Mounting Parameters Calibration of GPS/INS-Assisted Photogrammetric Systems," in International Workshop on Multi-Platform/Multi-Sensor Remote Sensing and Mapping (M2RSM), 2011, pp. 1-6.