Article

Accurate Calibration Scheme for a Multi-Camera Mobile Mapping System

by Ehsan Khoramshahi 1,2,*, Mariana Batista Campos 1,3, Antonio Maria Garcia Tommaselli 3, Niko Vilijanen 1,4, Teemu Mielonen 5, Harri Kaartinen 1,6, Antero Kukko 1,4 and Eija Honkavaara 1

1 Department of Remote Sensing and Photogrammetry of the Finnish Geospatial Research Institute FGI, Geodeetinrinne 2, FI-02430 Masala, Finland
2 Department of Computer Science, University of Helsinki, FI-00014 Helsinki, Finland
3 Cartographic Department, School of Technology and Sciences, São Paulo State University (UNESP), São Paulo 19060-900, Brazil
4 Department of Built Environment, School of Engineering, Aalto University, 02150 Espoo, Finland
5 National Land Survey of Finland, Opastinsilta 12C, 00521 Helsinki, Finland
6 Department of Geography and Geology, University of Turku, 20014 Turku, Finland
* Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(23), 2778; https://doi.org/10.3390/rs11232778
Submission received: 16 October 2019 / Revised: 18 November 2019 / Accepted: 21 November 2019 / Published: 25 November 2019

Abstract: Mobile mapping systems (MMS) are increasingly used in many photogrammetric and computer vision applications because they enable fast and accurate geospatial data collection. The accuracy of point positioning in an MMS depends mainly on the quality of system calibration, the accuracy of sensor synchronization, the accuracy of georeferencing, and the stability of the geometric configuration of space intersections. In this study, we focus on multi-camera calibration (interior and relative orientation parameter estimation) and MMS calibration (mounting parameter estimation). The objective was to develop a practical scheme for rigorous and accurate system calibration of a photogrammetric mapping station equipped with a multi-projective camera (MPC) and a global navigation satellite system (GNSS) and inertial measurement unit (IMU) for direct georeferencing. The proposed technique comprises two steps. First, the interior orientation parameters of each individual camera in the MPC and the relative orientation parameters of each camera with respect to the first camera are estimated. In the second step, the offset and misalignment between the MPC and the GNSS/IMU are estimated. The global accuracy of the proposed method was assessed using independent check points. A correspondence map for a panorama is introduced that provides metric information. Our results show that the proposed calibration scheme reaches centimeter-level global accuracy for 3D point positioning. This level of accuracy demonstrates the feasibility of the proposed technique and its potential for accurate mapping applications.

1. Introduction

A mobile mapping system (MMS) is a photogrammetric mapping platform, usually defined as a set of navigation sensors (global navigation satellite system (GNSS) and inertial measurement unit (IMU)) and remote sensors—such as cameras, lidar, and odometers—integrated on a common moving platform [1]. The importance of MMSs has been widely highlighted based on their cost-effectiveness, high data-capturing rate, and acceptable level of accuracy [2]. System calibration is an indispensable part of employing an MMS. For an MMS based on multi-cameras and navigation systems, two calibration processes are required: multi-camera calibration and MMS calibration. Multi-camera calibration aims to estimate the interior and relative orientations of a multi-camera. MMS calibration refers to estimating the relative position and orientation between the multi-camera and the GNSS/IMU sensors. Accurate system calibration ensures high-quality outputs for at least the period during which the mapping system stays relatively stable; a periodic system calibration scheme can guarantee the correctness of time-dependent parameters.
The first aspect of employing an optical MMS relates to the problem of multi-camera calibration [2,3,4,5,6,7,8,9,10,11]. Nowadays, many MMSs are equipped with multi-projective cameras (MPC) because of their sturdy design, large field of view (FOV), and promising sensor models. Sensor modeling of an MPC usually consists of the interior orientation parameters (IOP) of the individual cameras, the relative orientation parameters (ROP) of the cameras with respect to a reference camera, and a scale factor that connects the ROPs to a global framework. An important aspect of integrating a camera system in an MMS is employing a rigorous sensor model that fits the physics of the camera. The sensor model maps a 3D object point into its corresponding image point [12]. The next step is to employ the sensor model as the core of a statistical optimization model to find optimal values and express the uncertainties about the unknowns, such as the camera parameters or the 3D positions of object points.
Many MMSs use multiple cameras but do not necessarily generate panoramas. A panorama is a continuous representation of an environment, composed of one photo or a series of photos that are merged by either a ‘stitching’ or a ‘geometric non-stitching’ approach. This form of photography provides viewers with unlimited viewing possibilities in all directions [12]. Because of this all-angle viewing property, panoramic photography has found a wide range of applications, including computer vision, robotics, surveillance, virtual reality, indoor/outdoor photography, and historical heritage documentation.
Recently developed photogrammetric models for MPCs have brought a new perspective to panoramic image applications. If these cameras, which are initially designed for all-direction photography, are treated with rigorous photogrammetric models, numerous potential applications will emerge for surveying and mapping purposes.
A categorization of panoramic cameras helps to identify shared properties and mathematical models for the members of each class. Amiri Parian and Gruen [12] categorized panoramic imaging into four groups: stitched, mirror-based rotating-head, near-180°, and scanning-360° panoramas. Stitched panoramas are mainly used for non-metric applications where accurate directions are not important; therefore, this class is outside the focus of a metric categorization. Most commercially available panoramic cameras that are suitable for surveying tasks can instead be re-categorized, according to their internal capturing technology, into four groups [13]: (1) rotating-head, (2) multi-fisheye, (3) multi-projective, and (4) catadioptric systems. The benefits of the latter categorization are twofold: most commercially available cameras that are useful in metric photogrammetry tasks fit into it, and it simplifies sensor modeling, since it is founded on a similarity measure that places cameras into a category according to their common sensor model.
Category (1), rotating-head cameras, contains cameras based on a linear-array CCD mounted on a vertically rotating head. Examples of this class include the EYESCAN M3, jointly developed by DLR and KST [14], and the SpheroCam, manufactured by SpheronVR AG. Equivalently, a projective camera such as a Canon EOS 6D can be mounted on a motorized rotating head such as the Phototechnik AG Roundshot Metric, piXplorer 500, or GigaPan EPIC Pro. The camera and step motors are synchronously controlled by a central processing unit (CPU) that plans the movements and takes the image shots. If the system has been accurately calibrated (the optical center of the projective camera’s lens precisely located on the center of rotation), high-precision metric information and high-resolution panoramic images can be obtained with little effort [15]. Depending on the target application, one or two step motors work simultaneously to rotate the central camera around a point or axis. One main configurable parameter in building this class is the choice between one or two rotating axes; a post-processing module transfers and merges the images that have been captured with different orientations. Subsequently, a color-blending approach is required to ensure that color tones are consistent between stitched images. A full-resolution shot usually takes from a few seconds up to a few minutes to compile, which makes this class of panoramic cameras relatively unsuitable for mobile mapping. Important use cases of this class are cultural heritage recording and classification [16,17], high-precision surveying [12,18,19,20,21], industrial-level visualization, indoor visualization, 3D modeling, and artistic photography.
Category (2), multi-fisheye cameras, comprises dual spherical camera modules mounted on a rigid frame to construct a lightweight mobile 360° imaging system. A few consumer-grade examples of this class are the Samsung Gear 360 [22], SVPro VR 360, MoFast 720, MGCOOL Cam 360, SP6000 720, and Ricoh Theta S [7]. Some recent works propose multi-camera systems based on a dual-fisheye design [23,24]. Fisheye camera models for photogrammetric applications have been extensively studied and tested [25,26,27,28,29]. The simultaneous geometric calibration of multi-camera systems based on fisheye images, aiming at a 360° field of view, has also started to be discussed recently; for instance, dual-fisheye calibration models are proposed in [30] and [7]. For this class of cameras, a customized statistical optimization process that involves weighted observations and initial distributions of the unknowns has proved sufficiently accurate for low- to medium-accuracy surveying applications. However, these systems have limitations, such as non-perspective inner geometry, large scale and illumination variations between scenes, large radial distortion, and non-uniform spatial resolution. The overall image quality of panoramic images produced by this class is therefore usually worse than that of the first or third categories; however, desirable aspects—such as the data-capturing rate and simplicity—make them ideal candidates for applications such as low-accuracy mobile mapping or 3D virtualization [31,32,33,34,35].
Category (3), multi-projective cameras, contains panoramic cameras with an arbitrary number of projective cameras mounted rigidly on a fixed frame (an industrial example is the LadyBug with six cameras [v.3 and v.5; $MPC_6$] by FLIR; a consumer-grade example is the Panono 360 with 36 cameras [$MPC_{36}$]). Cameras of this class (multi-projective cameras, or MPC for short) are usually customized for specific imaging, navigation, or surveying tasks. A synchronous shutter mechanism takes simultaneous shots (<1 ms delay). A geometric model for an MPC integrated into a statistical adjustment model has been proposed by many researchers, e.g., [9,10,11]. This model ensures desirable geometric accuracies for tasks such as 3D mapping and surveying, 3D visualization, and texturing. The panoramas initially generated for this class are based on stitching techniques that mainly have visualization and artistic value, in contrast to the geometric value of panoramic images taken by cameras of the first or second categories. This class of cameras has found many applications, such as surveying, robotics, visualization, cinema, and artistic photography. Similar to the second class, it has the potential to be employed in applications that need fast data-capturing rates, such as mobile mapping or navigation.
Early attempts to employ relative orientation constraints among multiple cameras were applied to stereo cameras, e.g., [3,36,37]. He et al. [36] developed an MMS with a stereo camera and a GPS receiver to measure the global coordinates of any point through photogrammetric intersection. King [3] modified the conventional bundle block adjustment (BBA) to accept relative orientation stability constraints. Zhuang [37] employed a fixed-length moving target to calibrate a stereo camera. Later, more complex systems of cameras came under investigation. For example, Svoboda et al. [38] calibrated an $MPC_4$ installed in a visual room. Lerma et al. [39] calibrated an $MPC_3$ consisting of a stereo camera and a thermal camera by employing distance constraints. Habib et al. [40] analyzed variations in the IOPs/exterior orientation parameters (EOP) of multi-cameras. Detchev et al. [41] presented a system calibration scheme employing system stability analysis. Some researchers employed a calibration field for multi-camera calibration. For example, Tommaselli et al. [5] employed ArUco coded targets [42] to design a terrestrial calibration field, which they used to calibrate fisheye, catadioptric, and multi-cameras. Khoramshahi and Honkavaara [10] employed customized coded targets (CT) to create a calibration room, which they used to calibrate a complex $MPC_{36}$ (Panono) and an $MPC_6$ (LadyBug v.3). In this work, we follow the calibration model proposed by Khoramshahi and Honkavaara [10].
Category (4) contains catadioptric systems, which employ complex optical assemblies. A set of spherical and aspherical lenses, shaped mirrors (parabolic, hyperbolic, or elliptical [43]), and refractors are employed in catadioptric systems to cover a large FOV. An example of this class is the prototype dual catadioptric camera proposed in [23]. A calibration model for a catadioptric camera consisting of a wide-angle camera and a conic mirror was proposed in [44].
The second aspect of employing an MMS relates to the problem of direct georeferencing. Using multi-cameras of the third category for mapping applications requires accurate determination of the orientation and position of each panorama. This orientation can be obtained by indirect or direct georeferencing techniques, or by combining both. Direct georeferencing is very useful for real-time applications such as mobile mapping or aerial surveying. It is the process of finding the positions and orientations of the captured images (EOPs) in a global reference frame without employing any ground control points (GCP), which requires integrating additional sensors, such as GNSS receivers and IMUs, into the camera’s mounting frame. This integration either provides initial values for the positions and orientations of the camera shots as weighted observations in the BBA, or lets the system instantly estimate the position vectors and Euler angles of the shots [45]. Successful direct georeferencing requires two main ingredients: first, the GNSS and IMU sensors must be accurately synchronized to the MPC; second, a robust estimate of the lever-arm vector and boresight angles must be known. These latter parameters are known as the mounting parameters [46]. A rough estimate of the MMS calibration parameters can be obtained by comparing the GNSS and IMU observations with the output of the BBA. To enable direct georeferencing, a customized sensor model is required that includes the additional lever-arm and boresight misalignment parameters of the MPC with respect to the GNSS and IMU sensors.
A variety of calibration schemes has been discussed at length in the literature; however, as far as the authors are aware, no rigorous, practical, and easy solution for direct georeferencing of an MPC exists that completely fits the configuration of a multi-camera MMS. Moreover, a stitching operation is normally required to compile a panorama from the images of third-category cameras, and the resulting panoramic presentation usually has only artistic or visual value. A non-stitching panorama creation scheme is also feasible for this class and adds geometric value to the generated panoramas. This paper contributes a rigorous, easy, and practical scheme for calibrating a multi-camera MMS equipped with an MPC, GNSS, and IMU sensors on a terrestrial vehicle. We also present a novel non-stitching algorithm by introducing a panoramic correspondence map. We propose a modified BBA that comprises MPC calibration and MMS calibration. We quantitatively assess the quality of the proposed approach by checking the accuracy of point intersection against independent checkpoints. Our results demonstrate that sub-decimeter accuracy is achievable with this technique.

2. System Calibration

This section discusses the mathematical theory behind the implementation of this work. We first present the sensor model of an MPC, then a complete sensor model for the underlying MMS. Finally, a modified BBA is discussed that employs the sensor model and statistically expresses the uncertainties about the parameters.

2.1. Individual Camera Model

The following interior orientation model removes the main distortions from the image coordinates and outputs undistorted pinhole image coordinates as

$$
\begin{aligned}
x_1 &= \frac{(x)_{t_1} - PP_x}{f}, \qquad y_1 = \frac{(y)_{t_1} - PP_y}{f},\\
r^2 &= x_1^2 + y_1^2, \qquad \mathrm{Rad} = 1 + K_1 r^2 + K_2 r^4 + K_3 r^6,\\
(x_n)_{t_1} &= x_1\,\mathrm{Rad} + 2 P_1 x_1 y_1 + P_2 (r^2 + 2 x_1^2) - \delta\, x_1 + \lambda\, y_1,\\
(y_n)_{t_1} &= y_1\,\mathrm{Rad} + 2 P_2 x_1 y_1 + P_1 (r^2 + 2 y_1^2) + \lambda\, x_1,
\end{aligned}
\tag{1}
$$

where $(x)_{t_1}$ and $(y)_{t_1}$ are the distorted coordinates of a point in pixel units, $PP_x$ and $PP_y$ are the coordinates of the principal point, $K_i$ are the radial distortion coefficients, $P_i$ are the tangential distortion parameters, and $\delta$ and $\lambda$ are the scale and shear factors, respectively. It is straightforward to show that this model (Equation (1)) is equivalent to Brown’s model [47,48].
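As a concrete illustration, a minimal Python sketch of Equation (1) follows (the paper’s implementation is in C++ and MATLAB; the function name and the coefficient values in the example call are ours and are placeholders, not calibration results):

```python
import numpy as np

def undistort_pinhole(x_px, y_px, f, ppx, ppy, K, P, delta, lam):
    """Apply the interior orientation model of Equation (1).

    Maps distorted pixel coordinates to undistorted pinhole (f = 1)
    coordinates. K = (K1, K2, K3) are the radial terms, P = (P1, P2)
    the tangential terms, delta/lam the scale and shear factors.
    """
    # Reduce to the principal point and normalize by the focal length.
    x1 = (x_px - ppx) / f
    y1 = (y_px - ppy) / f
    r2 = x1**2 + y1**2
    rad = 1.0 + K[0]*r2 + K[1]*r2**2 + K[2]*r2**3
    # Radial, tangential, scale, and shear corrections (Equation (1)).
    xn = x1*rad + 2*P[0]*x1*y1 + P[1]*(r2 + 2*x1**2) - delta*x1 + lam*y1
    yn = y1*rad + 2*P[1]*x1*y1 + P[0]*(r2 + 2*y1**2) + lam*x1
    return xn, yn

# Example with placeholder values (not from this paper):
xn, yn = undistort_pinhole(1200.0, 800.0, f=3156.1, ppx=1232.0, ppy=1024.0,
                           K=(-0.1, 0.01, 0.0), P=(1e-4, -1e-4),
                           delta=0.0, lam=0.0)
```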
An image of a projective camera is linked to the camera’s sensor model and to six orientation and position parameters that uniquely determine it in 3D Euclidean space. A linear relationship between an object point and its undistorted pinhole coordinates is therefore established by the collinearity model

$$
x_1 = \frac{M_1 \cdot (\mathbf{X} - \mathbf{X}_0)}{M_3 \cdot (\mathbf{X} - \mathbf{X}_0)}, \qquad
y_1 = \frac{M_2 \cdot (\mathbf{X} - \mathbf{X}_0)}{M_3 \cdot (\mathbf{X} - \mathbf{X}_0)},
\tag{2}
$$

where $M_i$ is the $i$th row of the rotation matrix $M_{3\times3}$, $\mathbf{X}$ is the (3 × 1) vector of the object point’s coordinates, and $\mathbf{X}_0$ is the (3 × 1) vector of the image’s location in 3D Cartesian space (bold letters denote vectors and matrices, and normal letters denote scalars throughout this paper).
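Equation (2) is equally compact in code; the following sketch projects an object point into pinhole coordinates under the same conventions (names are ours):

```python
import numpy as np

def project_collinearity(M, X0, X):
    """Project object point X (3,) into pinhole coordinates, Equation (2).

    M is the 3x3 world-to-camera rotation matrix and X0 the camera
    center; the result is expressed in pinhole units (f = 1).
    """
    d = M @ (X - X0)          # point expressed in the camera frame
    return d[0] / d[2], d[1] / d[2]

# Example: camera at the origin with identity rotation.
x1, y1 = project_collinearity(np.eye(3), np.zeros(3),
                              np.array([1.0, 2.0, 10.0]))   # -> (0.1, 0.2)
```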

2.2. Multi-Projective Camera Model

An MPC is represented in this work by the following calibration parameters:
  • $IOP_i\ (i = 1:n)$: $(f,\ PP_x,\ PP_y,\ K_1, K_2, K_3, P_1, P_2, \delta, \lambda)$,
  • $ROP_i\ (i = 2:n)$: $\{(\zeta, \eta, \psi), (\Delta_x, \Delta_y, \Delta_z)\}$,
  • $\Lambda_{MPC}$,
where $IOP_i$ are the interior orientation parameters of an individual camera of the MPC, $ROP_i$ is the relative orientation of an individual camera with respect to the first camera, $(\zeta, \eta, \psi)$ are Euler angles, $(\Delta_x, \Delta_y, \Delta_z)$ are displacements, and $\Lambda_{MPC}$ is an unknown scale factor. The last parameter ($\Lambda_{MPC}$) plays a role only if the scale of the MPC sensor differs from the scale of the model; it can be ignored if both scales are equal. In total, there are $n_{Calib} = n \times 10\,(IOP) + (n-1) \times 6\,(ROP) + 1\,(scale) = 16n - 5$ calibration parameters for an MPC with $n$ projective cameras; for example, the six-camera LadyBug used in this study requires $16 \times 6 - 5 = 91$ parameters. A multi-projective image (MPI) is geometrically defined by its corresponding MPC sensor and six EOPs.

2.3. Space Resection of Multi-Projective Images

The goal of space resection is to find the approximate location of a given MPI with respect to a 3D frame, under the condition that at least three image–object correspondences exist and are localized in a 3D Cartesian framework. Space resection helps to orient a set of MPIs with respect to a given point set (e.g., a calibration room). The orientations are finally adjusted through a BBA.
An alternative way to resect an MPI is to employ the coplanarity equation. The coplanarity condition connects two normalized pinhole coordinates and the image centers through the essential matrix, a (3 × 3) matrix composed of three rotations and three translations:

$$
E = [(\mathbf{X}_0)_{t_2}]_\times \cdot R_{t_2},
\tag{3}
$$

where $[(\mathbf{X}_0)_{t_2}]_\times$ is the matrix representation of the cross product, i.e., the (3 × 3) skew-symmetric matrix

$$
[(\mathbf{X}_0)_{t_2}]_\times =
\begin{pmatrix}
0 & -(X_0)_{t_2}^{3} & (X_0)_{t_2}^{2} \\
(X_0)_{t_2}^{3} & 0 & -(X_0)_{t_2}^{1} \\
-(X_0)_{t_2}^{2} & (X_0)_{t_2}^{1} & 0
\end{pmatrix}.
\tag{4}
$$

The essential matrix is estimated from the coplanarity condition

$$
(\bar{\mathbf{x}}_{t_1})^{T} \cdot E_{(t_1, t_2)} \cdot (\bar{\mathbf{x}}_{t_2}) = 0,
\tag{5}
$$

where $\bar{\mathbf{x}}_{t_1}$ is a normalized image point at time $t_1$, and $\bar{\mathbf{x}}_{t_2}$ is the calculated normalized location of the same 3D point in a synthesized image. Space resection of an MPI is performed for each of its projective images. At least 5–8 tie points are required, depending on the selected method (5-point [49], 6-point, 8-point, or normalized 8-point [50]). Then a synthesized pair is formed, and the synthesized image is localized such that the resulting stereo pair becomes stable. Next, the retrieved position is translated into an approximate position of the parent MPI $(\mathbf{X}_0^i, \boldsymbol{\omega\phi\kappa}^i)$; finally, all positions are averaged to estimate the position and orientation of the MPI:

$$
\mathbf{X}_0 = \frac{1}{k}\sum_{i} \mathbf{X}_0^{i}, \qquad
\boldsymbol{\omega\phi\kappa} = \frac{1}{k}\sum_{i} \boldsymbol{\omega\phi\kappa}^{i},
\tag{6}
$$

where $k$ is the number of resected projective images.
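The essential-matrix step of Equation (5) can be sketched as follows. This is a plain linear 8-point estimator in Python/NumPy under our own naming; the paper’s pipeline may instead use the 5-point [49] or normalized 8-point [50] variants:

```python
import numpy as np

def essential_8pt(x1, x2):
    """Linear (8-point) estimate of the essential matrix, Equation (5).

    x1, x2: (N, 2) arrays of corresponding *normalized pinhole*
    coordinates (f = 1, distortion removed), N >= 8. Returns E with
    the essential-matrix singular-value pattern (s, s, 0) enforced.
    """
    h1 = np.column_stack([x1, np.ones(len(x1))])   # homogeneous points
    h2 = np.column_stack([x2, np.ones(len(x2))])
    # Each correspondence gives one row of A such that A @ vec(E) = 0,
    # i.e. h1^T E h2 = 0 as in Equation (5).
    A = np.column_stack([h1[:, i] * h2[:, j]
                         for i in range(3) for j in range(3)])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    # Project onto the essential manifold: two equal singular values.
    U, s, Vt = np.linalg.svd(E)
    sigma = (s[0] + s[1]) / 2.0
    return U @ np.diag([sigma, sigma, 0.0]) @ Vt
```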

2.4. Relative Orientation between Two Multi-Projective Images

Finding the ROPs between two MPIs is essential for constructing a scene independently. The ROPs between two MPIs are expressed by six parameters, of which five are independent; usually the largest element of the stereo pair’s baseline is scaled to one. Three rotations and two displacements are then solved by the relative orientation module. To estimate the ROPs between two MPIs, an averaging approach similar to that of the space resection is used: all image pairs between the two MPIs are oriented, and the resulting ROPs are transformed to the parent MPI and averaged.
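A minimal sketch of this averaging step follows, assuming the per-pair estimates have already been transformed into the parent MPI frame. Instead of averaging Euler angles directly as in Equation (6), we use scipy’s rotation mean, which is invariant to quaternion sign (all names are ours):

```python
import numpy as np
from scipy.spatial.transform import Rotation

def average_pose(rotations, centers):
    """Average per-pair pose estimates of a parent MPI (cf. Equation (6)).

    `rotations` is a list of scipy Rotation objects and `centers` a list
    of (3,) position estimates, all already in the parent MPI frame.
    """
    quats = np.array([r.as_quat() for r in rotations])
    # Rotation.mean() computes a chordal-L2 mean of the stacked rotations.
    R_mean = Rotation.from_quat(quats).mean()
    X0_mean = np.mean(np.asarray(centers), axis=0)
    return R_mean, X0_mean
```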
In the process of MMS calibration, the initial values of the lever-arm and boresight misalignments are estimated by independently compiling a local network of a few shots and then connecting this local network to the global frame of the GNSS/IMU sensors.

2.5. Bundle Block Adjustment

The BBA employs a non-linear least-squares adjustment to minimize a cost function based on the residuals of the observational equations. It finally expresses the uncertainties about the unknowns of an MPC calibration as a covariance matrix. The MPC cost function is a combination of all observational equations and is expressed as the vector function

$$
F\big((f, PP, K, P, \delta, \lambda)_{i=1:n},\ R(\zeta, \eta, \psi)_{i=2:n},\ \Delta_{i=2:n},\ R(\omega, \phi, \kappa)_{t=t_1:t_m},\ (\mathbf{X}_0)_{t=t_1:t_m},\ \mathbf{X}_{(1:nO)},\ \mathbf{x}_{(1:nI)},\ la,\ bs\big) = \mathbf{0},
\tag{7}
$$

where $(f, PP, K, P, \delta, \lambda)_{i=1:n}$ are the IOPs of the projective cameras $(1:n)$; $R(\zeta, \eta, \psi)_{i=2:n}$ and $\Delta_{i=2:n}$ are the structural parameters of the MPC; $R(\omega, \phi, \kappa)_{t=t_1:t_m}$ and $(\mathbf{X}_0)_{t=t_1:t_m}$ are the orientations and locations of the image shots; $\mathbf{X}_{(1:nO)}$ and $\mathbf{x}_{(1:nI)}$ are the positions of the object points and the corresponding image points; and $la$ and $bs$ are the lever-arm vector and boresight misalignments, respectively. To model the entire system, equations for the image observations, the GNSS/IMU observations, and the GCPs are introduced.
Two observational equations are considered in the MPC’s BBA for each image tie point:

$$
-x_i^j \cdot (M_3)_{t_1}^j \, (\mathbf{X}_i - (\mathbf{X}_0)_{t_1}^j) + (M_1)_{t_1}^j \, (\mathbf{X}_i - (\mathbf{X}_0)_{t_1}^j) = 0,
\tag{8}
$$

$$
-y_i^j \cdot (M_3)_{t_1}^j \, (\mathbf{X}_i - (\mathbf{X}_0)_{t_1}^j) + (M_2)_{t_1}^j \, (\mathbf{X}_i - (\mathbf{X}_0)_{t_1}^j) = 0.
\tag{9}
$$

In Equations (8) and (9), $(x_i^j, y_i^j)$ are the pinhole coordinates of object point $i$ in image $j$ (pinhole camera, $f = 1$), $(M_i)_{t_1}^j$ is the $i$th row of the rotation matrix $M_{t_1}^j$ ($j$th projective camera, $j: 1 \dots n$, at time $t_1$), $(\mathbf{X}_0)_{t_1}^j$ is the location of the $j$th projective camera at time $t_1$, and $\mathbf{X}_i$ is the 3D location of the corresponding object point $i$.
To combine the observations of the GNSS and IMU sensors into the BBA, a link between the GNSS, the IMU, and the MPC is established (Figure 1). Six additional parameters are introduced to represent a shift (lever-arm) and orientations (boresight misalignments). The connection is formulated by six non-linear equations (Equations (10) and (11)). The observational equations for the lever-arm and boresight misalignments are

$$
(\phi\ \lambda\ \kappa)_{(I_t)} - \mathcal{E}\big((R_C)_I \cdot (R_{C_t})\big) = 0,
\tag{10}
$$

$$
\mathbf{X}_0^{I_t} - \mathbf{X}_{C_t} + (R_{C_t}) \cdot (R_C)_I^T \cdot (\mathbf{X}_C)_I = 0,
\tag{11}
$$

where $(\phi\ \lambda\ \kappa)_{(I_t)}$ are the ‘observed’ Euler angles of the INS at time $I_t$, $(R_C)_I$ is the rotation matrix of the MPC with respect to the INS local coordinate system, and $R_{C_t}$ is the rotation matrix of the MPC at time $t$. In Equation (11), $\mathbf{X}_0^{I_t}$ is the observed position of the INS at time $t$, $\mathbf{X}_{C_t}$ is the unknown position of the MPC at time $t$, and $(\mathbf{X}_C)_I$ is the relative position of the camera with respect to the INS in the INS’s local coordinate system. Here, $\mathcal{E}(\cdot)$ denotes the function that maps a rotation matrix to its corresponding Euler angles.
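The following sketch evaluates the residuals of Equations (10) and (11) for one epoch; the Euler-angle convention and all names are our assumptions:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def mounting_residuals(eul_obs, X_ins_obs, R_c, X_c, R_ci, X_ci):
    """Residuals of the mounting observations, Equations (10) and (11).

    eul_obs   : observed INS Euler angles at one epoch (3,)
    X_ins_obs : observed INS position at the same epoch (3,)
    R_c, X_c  : unknown MPC rotation matrix and position at that epoch
    R_ci      : boresight rotation of the MPC w.r.t. the INS
    X_ci      : lever-arm of the camera w.r.t. the INS (INS frame)
    The 'xyz' Euler convention below is an assumption of this sketch.
    """
    # Equation (10): observed Euler angles minus E((R_C)_I * R_Ct).
    r10 = eul_obs - Rotation.from_matrix(R_ci @ R_c).as_euler('xyz')
    # Equation (11): X_0^It - X_Ct + R_Ct * (R_C)_I^T * (X_C)_I.
    r11 = X_ins_obs - X_c + R_c @ R_ci.T @ X_ci
    return np.concatenate([r10, r11])
```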
GCPs contribute to the BBA through three observational equations:

$$
\mathbf{X}_g - \mathbf{X}_g^{o} = \mathbf{0},
\tag{12}
$$

where $\mathbf{X}_g$ is the unknown 3D position of a GCP and $\mathbf{X}_g^{o}$ is its observed position.
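To make the observational equations concrete, the toy below stacks Equations (8) and (9) for one object point and solves a two-ray space intersection with scipy; in the full BBA, the same residual vector is extended with Equations (10)–(12) and the IOPs, ROPs, and EOPs are freed as well (all names and numbers here are illustrative):

```python
import numpy as np
from scipy.optimize import least_squares

def tie_point_residuals(X, obs):
    """Stacked Equations (8)-(9) for one object point seen in many images.

    obs: list of (x, y, M, X0) tuples -- a pinhole observation plus the
    image's rotation matrix and position. X is the unknown 3D point.
    """
    res = []
    for x, y, M, X0 in obs:
        d = M @ (X - X0)
        res += [-x * d[2] + d[0], -y * d[2] + d[1]]   # Equations (8), (9)
    return np.array(res)

# Toy space intersection: two cameras on a 1 m baseline observe the
# point (0.2, 0.1, 5.0); least_squares recovers it from the residuals.
M = np.eye(3)
obs = [(0.2/5.0, 0.1/5.0, M, np.zeros(3)),
       ((0.2 - 1.0)/5.0, 0.1/5.0, M, np.array([1.0, 0.0, 0.0]))]
sol = least_squares(tie_point_residuals, x0=np.array([0.0, 0.0, 1.0]),
                    args=(obs,))
# sol.x is approximately [0.2, 0.1, 5.0].
```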

2.6. Panoramic Compilation

Creating a panorama for an MPC without image stitching is possible provided the MPC is calibrated (known IOPs and ROPs); the calibration data alone is sufficient to compile a non-stitching panorama. The algorithm loops over a hypothetical sphere or cylinder (an optional choice of projection system) that covers an MPI; for each pixel of the final panoramic compilation, it finds a corresponding camera index (0–n) and an image location (in pixels). For a better rendering result, an appropriate interpolation technique (bilinear or bicubic) should be considered, since the projection from the sphere (or cylinder) to the projective images results in sub-pixel positions. If the correspondence data is saved as a metadata file (a ‘panoramic correspondence map’), the final panorama has geometric value, because the collinearity condition between an object point, its corresponding image point, and the perspective center of a projective camera of the MPC is preserved. A second use of the correspondence map is efficient panorama compilation, since indexing is much quicker than direct calculation.
There are regions of the correspondence map where more than one image is a valid choice for a panorama pixel, because of the overlaps between adjacent cameras. A criterion is needed to build a unique map; for example, the minimum incident angle can be used.
Depending on the distance of a 3D object point from the panorama, discontinuities are expected to occur at the edges, because the projection centers of the individual cameras do not coincide with the center of the hypothetical sphere; the discontinuity is more severe for closer objects. Finally, an important post-processing step is color blending at the edges, which improves the quality of the compiled panorama. Without color blending, sharp steps appear at the edges where the pixel labels switch from one camera to another; color blending softens those steps and improves the color consistency of the final panorama.
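A hedged sketch of the correspondence-map construction described above is given below; the equirectangular grid, the camera interface, and the interpretation of ‘minimum incident angle’ as the angle to each camera’s optical axis are our assumptions:

```python
import numpy as np

def build_correspondence_map(width, height, cameras):
    """Build a panoramic correspondence map: (camera id, x, y) per pixel.

    `cameras` is a list of (M, project) pairs: M rotates an MPC-frame
    ray into the camera frame (camera looking down +z), and project()
    maps a camera-frame ray to pixel coordinates or returns None if
    the ray misses the sensor. Parallax between the camera centers and
    the sphere center is ignored, as in any single-viewpoint panorama.
    """
    cmap = np.full((height, width, 3), -1.0)
    for v in range(height):
        for u in range(width):
            # Ray direction on the hypothetical sphere (equirectangular grid).
            lon = 2.0 * np.pi * u / width - np.pi
            lat = np.pi / 2.0 - np.pi * v / height
            ray = np.array([np.cos(lat) * np.sin(lon),
                            np.sin(lat),
                            np.cos(lat) * np.cos(lon)])
            best = None
            for cam_id, (M, project) in enumerate(cameras):
                ray_cam = M @ ray
                pix = project(ray_cam)
                if pix is None:
                    continue
                # 'Minimum incident angle': prefer the camera whose
                # optical axis (+z) is most closely aligned with the ray.
                angle = np.arccos(np.clip(ray_cam[2], -1.0, 1.0))
                if best is None or angle < best[0]:
                    best = (angle, cam_id, pix)
            if best is not None:
                cmap[v, u] = (best[1], best[2][0], best[2][1])
    return cmap
```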

3. Methods and Material

3.1. Mobile Mapping System

The MMS in this work comprises an MPC, a GNSS, and an IMU firmly installed on a car roof. This section describes the experiments performed to assess the method proposed in Section 2. The method was implemented in C++ and MATLAB and evaluated using a real dataset from an MMS composed of a LadyBug 5 multi-camera and a navigation system installed on a rigid aluminum truss structure on top of a Skoda Superb (Figure 2).
The LadyBug 5 is an MPC with five side-looking cameras and one upward-looking camera. Each individual camera has a 2464 × 2048 pixel sensor. The focal length is 18 mm with a sensor size of 35.8 × 29.7 mm, giving a field of view of 89.3° × 81°. Panoramas cover 360° horizontally by 145° vertically; the resolution of the output panoramas is approximately 8K (7200 × 3600 pixels, aspect ratio 2:1).
The GNSS receiver used was a NovAtel PwrPak7, which together with the IMU forms a Synchronized Position Attitude Navigation (SPAN) system. The PwrPak7 logged GPS and GLONASS observations via a NovAtel VEXXIS GNSS-850 satellite antenna. The sensor trajectory was computed by combining the GNSS and IMU data with the Waypoint Inertial Explorer software. Virtual GNSS base station reference data was acquired from the commercial Trimnet service; the maximum distance from the trajectory (Figure 3) to the base station was ~350 m. The 3D RMSE of the GPS positions was 1.6 cm.
The IMU was a NovAtel IMU-ISA-100C, manufactured by Northrop Grumman LITEF GmbH and paired with the NovAtel SPAN GNSS receiver. The IMU-ISA-100C is a near-navigation-grade sensor containing fiber-optic gyros and temperature-compensated MEMS accelerometers. The GNSS-850 satellite antenna (in front of the IMU) was mounted on the truss structure on top of the Skoda passenger car, while the NovAtel PwrPak7 GNSS receiver, together with the operating and logging laptop, was inside the car.

3.2. MPC Calibration

FGI’s calibration room is an empty space of 519 × 189 × 356 cm equipped with 215 CTs (Figure 4). The 3D positions of all targets are known a priori from previous observations. Images taken with a Canon EOS 6D camera (20-megapixel sensor, image size 5472 × 3648 pixels, Canon EF 24 mm lens, focal length 20.6 mm) were used to accurately calculate the 3D coordinates of the targets [10] in the calibration room with a bundle adjustment before the MPC calibration.
In the first step, FGI’s camera calibration room was employed to calibrate the MPC. To do so, the 3D positions of the CTs had been accurately estimated with a Canon EOS 6D camera, achieving a relative 3D positioning precision of std < 1 mm. The automatic target extraction included (the first two steps are sketched in code below):
  • adaptive thresholding using first-order statistics to roughly estimate the positions of blobs (accuracy better than 2 pixels);
  • least-squares fitting of rotated ellipses to the extracted blobs (accuracy ≈ 0.1 pixel);
  • clustering the blobs;
  • finding CTs among the extracted clusters by employing the structural signature of the CTs; and
  • reading the ID of each target.
The automatic CT detection was implemented in MATLAB. The standard deviation of the image-point observations was set to 0.1 pixel.
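The paper’s detector is implemented in MATLAB; the following OpenCV sketch of the first two steps (rough blob detection, then sub-pixel ellipse fitting) is ours, with placeholder threshold parameters:

```python
import cv2
import numpy as np

def extract_blob_centers(gray):
    """Rough blob detection plus sub-pixel ellipse fitting, mirroring
    the first two steps of the target-extraction pipeline. Block size
    (31) and offset (10) are placeholder tuning values.
    """
    # Step 1: adaptive thresholding using local first-order statistics.
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, 31, 10)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_NONE)
    centers = []
    for c in contours:
        if len(c) < 5 or cv2.contourArea(c) < 20:  # fitEllipse needs >= 5 pts
            continue
        # Step 2: least-squares fit of a rotated ellipse to the blob
        # boundary; the ellipse center is the sub-pixel blob position.
        (cx, cy), axes, angle = cv2.fitEllipse(c)
        centers.append((cx, cy))
    return centers
```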
The estimated focal length of the Canon was 3156.10 pixels with a standard deviation of 0.04 pixels. The principal point was estimated as (2752.37, 1509.36) pixels with a standard deviation of 0.05 pixels. For each IOP (or set of IOPs), a significance analysis was performed by assessing the amount of distortion corrected by that parameter alone, with all other IOPs frozen, over a grid of 10 × 10 pixels; the largest distortion on the grid was taken as the significance value. The significance values for the radial distortions (as a set), the tangential distortions ($P_1$ and $P_2$), scale, and shear were 69.2, 2.4, 1.0, 0.03, and 0.06 pixels, respectively. The estimated standard deviations of the CT points were 0.15, 0.03, and 0.02 mm in the principal directions of the 3D error ellipsoids, indicating that our target accuracy was achieved.
In the second step, the same CTs were automatically extracted from the individual images of the MPC (LadyBug 5). Next, each MPI was resected (Section 2.3) using the calibration room’s CTs, and finally the BBA was performed to optimize the sensor parameters of the MPC. In this processing, the CT coordinates measured with the Canon were kept fixed.
Two datasets of 26 and 53 MPIs were captured in FGI’s calibration room to calibrate the MPC. In the first dataset, the camera was attached to a horizontal slide, so each group of images is coplanar and at the same height. In the second dataset, the camera was placed on a tripod at two different heights. In total, 474 individual projective images were captured, and the CTs were automatically extracted from all of them.

3.3. MMS Calibration

The MMS calibration was carried out using an outdoor dataset from Inkoo, a municipality in southern Finland. A set of 8424 MPIs was captured with the LadyBug 5 camera in the Inkoo harbour area and two nearby regions; most roads were captured by the MMS in both directions.
The GCPs were measured using a Topcon Hiper HR RTK dual-band receiver with a nominal accuracy of 5 mm + 0.5 ppm horizontally and 10 mm + 0.8 ppm vertically (manufacturer specification). The nearest base station was located in Metsähovi, 28 km from Inkoo; the resulting accuracy is therefore 19.0 mm horizontally (5 mm + 0.5 × 28) and 32.4 mm vertically (10 mm + 0.8 × 28).
The MMS calibration involved estimating the parameters of the MPC relative to the GNSS and IMU (lever-arm and boresight misalignments). Observational Equations (10) and (11) were added to the BBA, and the analytical partial derivatives of these equations were used to compute the Jacobian matrix. The calibrated MPC sensor was used to construct an outdoor case; the sensor was kept fixed during the scene reconstruction except for its scale parameter. Six MPIs were connected by the relative orientation module (Section 2.4) to construct a local (initial) network. The model coordinates of all 3D points were accurately determined, with corresponding sub-pixel re-projection errors, and added as observational equations. A few GCPs were observed in individual images and added to the model to establish a link to the global coordinate system (Equation (12)). Then, the GNSS and IMU observations were added to the initial network. The lever-arm and boresight were initialized from the initial network and finally optimized through the BBA. The MPC sensor was kept fixed during the MMS calibration to avoid a singularity in the Jacobian matrix. In the optimization, the locations and orientations of the MPIs, the 3D positions of the GCPs, and the six lever-arm and boresight parameters were set free.

3.4. Performance Assessment

To assess the accuracy of the MMS calibration and georeferencing, seven check sites approximately covering the area were selected (Figure 3). At each site, the GNSS and IMU readings of the MMS were translated into MPI orientations, which were converted into the orientation and position of each individual image using the MPC sensor calibration from Step 1 (Section 3.2). The image positions of the check points were then observed in the individual images; consequently, two sets of coordinates were available for each check site: (1) intersected coordinates from the image measurements and the collinearity equations, and (2) observed check-point positions from GPS. The difference was recorded along with the maximum intersecting angle of each 3D check point as a quality-control indicator; weaker intersection geometry was expected to result in higher object-space residuals.
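A minimal sketch of the check-point evaluation geometry follows: a least-squares intersection of the observation rays plus the maximum pairwise intersecting angle used as the quality indicator (names and the exact angle definition are our assumptions):

```python
import numpy as np

def intersect_rays(origins, dirs):
    """Least-squares intersection of rays (camera center + direction),
    returning the intersected point and the maximum pairwise incident
    angle (degrees) used as a quality-control indicator.
    """
    dirs = np.asarray(dirs, float)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # Each ray contributes (I - d d^T) X = (I - d d^T) o.
    A = np.zeros((3, 3)); b = np.zeros(3)
    for o, d in zip(np.asarray(origins, float), dirs):
        P = np.eye(3) - np.outer(d, d)
        A += P; b += P @ o
    X = np.linalg.solve(A, b)
    # Maximum incident angle between any two rays.
    cosines = np.clip(dirs @ dirs.T, -1.0, 1.0)
    max_angle = np.degrees(np.arccos(cosines.min()))
    return X, max_angle
```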

4. Results

4.1. MPC Calibration

Indoor calibration of the MPC resulted in sub-pixel individual-image accuracies (mean image residual 0.4 pixel, std. 0.45 pixel). Moreover, the internal structure of the final optimized MPC corresponded well to the physical reality of the camera, and the different height levels of the MPIs matched the capturing configuration, serving as a visual cross-check. The individual images had very large radial distortions, as the undistorted image in Figure 5 illustrates. The MPC’s significance matrix also showed a considerable value of 2183 pixels for the radial distortion of camera 1, caused by the lens structure; the significance values for the tangential distortions were relatively small (<10 pixels). The calibrated IOPs of the individual cameras with their corresponding standard deviations are given in Table 1. Most of the standard deviation values of camera 6 were higher than average; one reason could be the weaker distribution of CTs on the ceiling compared with the walls and the floor. The scale and shear factors appeared meaningful in the BBA, since their optimized values were clearly larger than their standard deviations. Table 2 shows the calibrated structure of the LadyBug camera; the order of magnitude of the standard deviations demonstrates an acceptable level of precision in the calibration. In this table, the values for the first camera represent the a priori distribution of camera 1’s structural parameters, and standard deviation values larger than average are highlighted in gray.
Figure 6 shows the absolute values of the normalized cross-correlations between all free parameters (IOPs and ROPs (structure)) of camera 2 (indoor calibration). The off-diagonal elements are of high importance and need to be treated carefully in the adjustment to avoid singularity; the main off-diagonal elements are marked by red rectangles. The radial distortions are the first group of highly correlated parameters; this high correlation implies that the collective effect of the group matters, so investigating an individual parameter of this group in isolation is not meaningful. The second highly correlated group is the ROPs $\{\zeta, \eta\}$, which have a strong correlation with $PP_x$. The last high correlation relates to the ROP $\{\psi\}$, which is highly correlated with $PP_y$. Figure 7 shows the absolute values of the normalized cross-correlations between all free parameters of the MPC; a slightly repeated pattern appears among the cameras. Two diagonal blocks of this figure are zero; they correspond to the structure of the first camera and to the lever-arm vector and boresight misalignments, which were kept fixed during the indoor calibration.

4.2. Mobile Mapping System Calibration

Table 3 shows the calibrated values of the lever-arm vector and boresight misalignments and their standard deviations. As expected, the standard deviations were larger than those of the structural parameters (the ROPs of the MPC): on average, 0.5° for the boresight misalignments and 2 cm for each component of the lever-arm vector.
In the adjustment, residuals of 9 cm on average were obtained for the GCPs (Table 4) on the outdoor calibration site (Figure 8); all GCPs had at least six intersecting rays. Table 5 shows the differences between the image positions and orientations estimated by the BBA and those observed by the GNSS/IMU. On average, a difference of approximately 0.2° was observed between the GNSS/IMU angles and the BBA outputs. For the directly measured positions, average differences of 11.3, 2.8, and 0.6 cm were observed in the X, Y, and Z coordinates, respectively.

4.3. Accuracy Assessment

Figure 9 and Figure 10 show the configurations of cross-check datasets 1 and 2, respectively, and Table 6 and Table 7 show the check-point errors, i.e., the differences between their intersected and observed values. On average, errors of 5.6 cm and 1.4 cm were observed between the intersected and observed check-point positions for cross-check datasets 1 and 2, respectively; the RMSEs were 0.69 cm and 0.18 cm. Comparable GPS errors for nearby observations resulted in similar errors among the 3D check points of a dataset.
Figure 11 plots the sorted incident angles (degrees) with their corresponding errors, highlighting the effect of the strength of the intersection geometry on positioning quality. Intersections of rays with a maximum incident angle below 20° showed larger fluctuations in the 3D-position error, and the error improved significantly once the incident angles exceeded 20°. The logical conclusion is that the intersection geometry becomes stronger as the incident angles approach $\pi/2$.
Figure 12a shows the errors of the 3D check points, and Figure 12b the corresponding intersecting angles. In Figure 12a, most errors were below 7 cm. Comparing the two figures, a negative correlation between incident angle and positional error is evident, as was also concluded from Figure 11. On average, an error of 4.2 cm was observed over all 3D check points, with an RMSE of 3.6 cm.

4.4. Panoramic Compilation

Panoramic images were compiled using the IOPs and ROPs. Figure 13 shows the footprints of the individual cameras on the final panoramic compilation as a contribution map. The map shows that each individual image covers a suitable amount of area in the final panorama, confirming that the underlying MPC is well designed.
The ‘minimum incident angle’ criterion was employed to create a unique correspondence map (Figure 14) from the contribution map (Figure 13). The correspondence map connects every pixel of the final panoramic compilation to a pixel of one camera; a vector of three numbers (image id, x, and y in pixels) was therefore saved for every pixel of the final panorama. Approximately 7 minutes of computation were required to build a 7200 × 3600 pixel correspondence map; afterwards, compiling any new panorama from the individual images was quick and straightforward (~2.1 s). Non-stitching panoramic compilation using the correspondence map produced geometrically accurate panoramas (Figure 15). As expected, objects closer to the MPC showed larger discontinuities over the edges (the boundaries where pixel labels change) than distant objects. The discontinuity at the edges was treated by [51] as a systematic error; by using the correspondence map, this systematic error is avoided entirely.

5. Discussion

In this work, we followed a technique similar to [11] in employing free-network calibration inside an indoor calibration room; however, we extended the work in three directions. Firstly, our modified BBA model contains the offset and misalignment with respect to the GNSS/IMU. Secondly, a non-stitching panorama was introduced along with a correspondence map, which adds geometric value to the non-stitching panorama scheme; the new scheme improves on the development of [51] by bringing the systematic edge error into the equations. Thirdly, we demonstrated error propagation through a statistical model, as standard deviation values of the parameters under investigation.
A complete calibration scheme for a multi-camera mobile mapping system (MMS) was presented, calibrating a multi-projective camera (MPC) to the GNSS and IMU. It is based on two steps: indoor MPC calibration and outdoor MMS calibration, which calibrates the MPC with respect to the GNSS and IMU. Most recent MPC calibration schemes reduce the number of interior orientation parameters (IOP) in order to avoid singularity; in this work, the singularity was addressed by employing a photogrammetric calibration room. Our modified bundle block adjustment model considers the offset and misalignment of the MPC with respect to the GNSS/IMU, which enables 3D reconstruction using the MMS either in a direct georeferencing mode or via an integrated bundle block adjustment. Furthermore, a non-stitching panoramic compilation scheme was introduced along with a correspondence map that connects the pixels of a compiled panorama to their correct positions in the MPC; the new scheme brings the systematic error of discontinuous edges into the equations. We finally demonstrated error propagation through a statistical model, as standard deviation values of the parameters under investigation. Statistical modeling proved to be a helpful tool for propagating the uncertainties from observations to unknowns based on a non-linear least-squares model.
The proposed calibration scheme was successfully demonstrated on the cross-check datasets. On average, 4.2 cm 3D residuals were observed for the cross-check object points; this level of accuracy highlights the usability of a multi-camera MMS in many surveying tasks. The standard deviations of the lever-arm vector and boresight misalignments were ~2 cm and ~0.5°, respectively. One reason for these relatively high uncertainties, compared with the lower uncertainties resulting from the indoor calibration, is the low number of observational equations in the MMS adjustment; these uncertainties are expected to improve with more observations. Our results showed that the intersection geometry is an important factor in 3D positioning. Our rigorous approach forms the basis for developing automated processing of MMS datasets in practical applications. Further research will include the assessment of image-based point clouds in reconstructing different scenes, such as urban scenes and road environments; autonomous object mapping based on the proposed technique is another future research objective.

Author Contributions

Conceptualization, E.K., E.H., and T.M.; Methodology, E.K.; Software, E.K.; Validation, E.K., M.B.C., A.M.G.T., and E.H.; Investigation, E.K., A.K., H.K., and N.V.; Resources, H.K., A.K., and T.M.; Writing—original draft preparation, E.K.; Writing—review and editing, E.K., M.B.C., A.M.G.T., N.V., T.M., H.K., A.K., and E.H.; Visualization, E.K.; Supervision, T.M. and E.H.; Project administration, T.M. and E.H.; Funding acquisition, E.H. and H.K.

Funding

This research was funded by the National Land Survey of Finland Development Project Funds “Kansallinen Maastotietokanta – Uudet Kuvamittaustekniikat” and the Strategic Research Council at the Academy of Finland “Competence Based Growth Through Integrated Disruptive Technologies of 3D Digitalization, Robotics, Geospatial Information and Image Processing/Computing Point Cloud Ecosystem” (293389/314312) and Academy of Finland “Multi-spectral personal laser scanning for automated environment characterization” (300066).

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Schwarz, K.P.; El-Sheimy, N. Mobile mapping systems–state of the art and future trends. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2004, 35 Pt B, 10. [Google Scholar]
  2. Toschi, I.; Rodríguez-Gonzálvez, P.; Remondino, F.; Minto, S.; Orlandini, S.; Fuller, A. Accuracy evaluation of a mobile mapping system with advanced statistical methods. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 245–253. [Google Scholar] [CrossRef]
  3. King, B. Methods for the photogrammetric adjustment of bundles of constrained stereopairs. Int. Arch. Photogramm. Remote Sens. 1994, 30, 473–480. [Google Scholar]
  4. Morgan, M.F.; Habib, A.F. Automatic calibration of low-cost digital cameras. Opt. Eng. 2003, 42, 948–956. [Google Scholar] [CrossRef]
  5. Tommaselli, A.M.G.; Galo, M.; De Moraes, M.V.A.; Marcato, J.; Caldeira, C.R.T.; Lopes, R.F. Generating Virtual Images from Oblique Frames. Remote. Sens. 2013, 5, 1875–1893. [Google Scholar] [CrossRef]
  6. Lichti, D.D.; Sharma, G.B.; Kuntze, G.; Mund, B.; Beveridge, J.E.; Ronsky, J.L. Rigorous Geometric Self-Calibrating Bundle Adjustment for a Dual Fluoroscopic Imaging System. IEEE Trans. Med. Imaging 2014, 34, 589–598. [Google Scholar] [CrossRef] [PubMed]
  7. Campos, M.B.; Tommaselli, A.M.G.; Junior, J.M.; Honkavaara, E. Geometric model and assessment of a dual-fisheye imaging system. Photogramm. Rec. 2018, 33, 243–263. [Google Scholar] [CrossRef]
  8. Detchev, I.; Habib, A.; Mazaheri, M.; Lichti, D. Practical In Situ Implementation of a Multicamera Multisystem Calibration. J. Sens. 2018, 2018, 1–12. [Google Scholar] [CrossRef]
  9. An, G.H.; Lee, S.; Seo, M.-W.; Yun, K.; Cheong, W.-S.; Kang, S.-J. Charuco Board-Based Omnidirectional Camera Calibration Method. Electronics 2018, 7, 421. [Google Scholar] [CrossRef]
  10. Khoramshahi, E.; Honkavaara, E. Modelling and automated calibration of a general multi-projective camera. Photogramm. Rec. 2018, 33, 86–112. [Google Scholar] [CrossRef]
  11. Jarron, D.; Lichti, D.D.; Shahbazi, M.M.; Radovanovic, R.S. Multi-Camera Panoramic Imaging System Calibration. 2019. Available online: https://prism.ucalgary.ca/handle/1880/110580 (accessed on 22 October 2019).
  12. Parian, J.A.; Gruen, A. A sensor model for panoramic cameras. In Proceedings of the 6th Optical 3D Measurement Techniques, Zurich, Switzerland, 22–25 September 2003; Volume 2, pp. 130–141. [Google Scholar]
  13. Maas, H.-G. Close range photogrammetry sensors. In Advances in Photogrammetry, Remote Sensing and Spatial Information Sciences: 2008 ISPRS Congress Book; CRS Press: London, UK, 2008; pp. 81–90. [Google Scholar]
  14. Scheibe, K.; Korsitzky, H.; Reulke, R.; Scheele, M.; Solbrig, M. Eyescan-a high resolution digital panoramic camera. In International Workshop on Robot Vision; Springer: Auckland, New Zealand, 2001; pp. 77–83. [Google Scholar]
  15. Kauhanen, H.; Rönnholm, P.; Lehtola, V.V. Motorized Panoramic Camera Mount–Calibration and Image Capture. ISPRS Ann. Photogramm. Remote. Sens. Spat. Inf. Sci. 2016, 3, 89–96. [Google Scholar] [CrossRef]
  16. Fangi, G. The Multi-image spherical Panoramas as a tool for Architectural Survey. CIPA Herit. Doc. 2011, 21, 311–316. [Google Scholar]
  17. Fangi, G.; Nardinocchi, C. Photogrammetric Processing of Spherical Panoramas. Photogramm. Rec. 2013, 28, 293–311. [Google Scholar] [CrossRef]
  18. Schneider, D.; Maas, H.-G. Geometric modelling and calibration of a high resolution panoramic camera. Opt. 3-D Meas. Technol. VI 2003, 2, 122–129. [Google Scholar]
  19. Schneider, D.; Maas, H. Application and accuracy potential of a strict geometric model for rotating line cameras. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2004, 34, 5. [Google Scholar]
  20. Schneider, D.; Maas, H.-G. A geometric model for linear-array-based terrestrial panoramic cameras. Photogramm. Rec. 2006, 21, 198–210. [Google Scholar] [CrossRef]
  21. Parian, J.A.; Gruen, A. An advanced sensor model for panoramic cameras. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2004, 35, 24–29. [Google Scholar]
  22. Barazzetti, L.; Roncoroni, F.; Previtali, M. 3d Modelling with the Samsung Gear 360. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, XLII-2/W3, 85–90. [Google Scholar] [CrossRef]
  23. Song, W.; Liu, X.; Lu, P.; Huang, Y.; Weng, D.; Zheng, Y.; Liu, Y.; Wang, Y. Design and assessment of a 360° panoramic and high-performance capture system with two tiled catadioptric imaging channels. Appl. Opt. 2018, 57, 3429–3437. [Google Scholar] [CrossRef]
  24. Lian, T.; Farrell, J.; Wandell, B. Image Systems Simulation for 360° Camera Rigs. Electron. Imaging 2018, 2018, 1–5. [Google Scholar] [CrossRef]
  25. Ray, S. The Fisheye Lens and Immersed Optics; Focal Press: New York, NY, USA, 2002; pp. 326–332. [Google Scholar] [CrossRef]
  26. Abraham, S.; Förstner, W. Fish-eye-stereo calibration and epipolar rectification. ISPRS J. Photogramm. Remote Sens. 2005, 59, 278–288. [Google Scholar] [CrossRef]
  27. Schwalbe, E. Geometric modelling and calibration of fisheye lens camera systems. In Proceedings of the Panoramic Photogrammetry Workshop, Berlin, Germany, 24–25 February 2005; pp. 5–8. [Google Scholar]
  28. Schneider, D.; Schwalbe, E.; Maas, H.-G. Validation of geometric models for fisheye lenses. ISPRS J. Photogramm. Remote Sens. 2009, 64, 259–266. [Google Scholar] [CrossRef]
  29. Hughes, C.; Denny, P.; Jones, E.; Glavin, M. Accuracy of fish-eye lens models. Appl. Opt. 2010, 49, 3338–3347. [Google Scholar] [CrossRef] [PubMed]
  30. Aghayari, S.; Saadatseresht, M.; Omidalizarandi, M.; Neumann, I. Geometric calibration of full spherical panoramic Ricoh-Theta camera. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, IV-1/W1, 237–245. [Google Scholar] [CrossRef]
  31. Barazzetti, L.; Previtali, M.; Roncoroni, F. 3D modelling with the Samsung Gear 360. In Proceedings of the 2017 TC II and CIPA-3D Virtual Reconstruction and Visualization of Complex Architectures, Nafplio, Greece, 1–3 March 2017; Volume 42, pp. 85–90. [Google Scholar]
  32. Campos, M.B.; Tommaselli, A.M.G.; Honkavaara, E.; Prol, F.D.S.; Kaartinen, H.; El Issaoui, A.; Hakala, T. A Backpack-Mounted Omnidirectional Camera with Off-the-Shelf Navigation Sensors for Mobile Terrestrial Mapping: Development and Forest Application. Sensors 2018, 18, 827. [Google Scholar] [CrossRef]
  33. Blaser, S.; Cavegn, S.; Nebiker, S. Development of a Portable High Performance Mobile Mapping System Using the Robot Operating System. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 4, 13–20. [Google Scholar] [CrossRef]
  34. Fangi, G.; Pierdicca, R.; Sturari, M.; Malinverni, E. Improving Spherical Photogrammetry Using 360° Omni-Cameras: Use Cases and New Applications. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, XLII–2, 331–337. [Google Scholar] [CrossRef]
  35. Losè, L.T.; Chiabrando, F.; Spanò, A. Preliminary Evaluation of a Commercial 360 Multi-Camera Rig for Photogrammetric Purposes. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2018, 42, 1113–1120. [Google Scholar] [CrossRef]
  36. He, G.; Novak, K.; Feng, W. Stereo camera system calibration with relative orientation constraints. Videometrics 1993, 1820, 2–8. [Google Scholar]
  37. Zhuang, H. A self-calibration approach to extrinsic parameter estimation of stereo cameras. Robot. Auton. Syst. 1995, 15, 189–197. [Google Scholar] [CrossRef]
  38. Svoboda, T.; Hug, H.; van Gool, L. ViRoom—Low cost synchronized multicamera system and its self-calibration. In Joint Pattern Recognition Symposium; Springer: Berlin/Heidelberg, Germany, 2002; pp. 515–522. [Google Scholar]
  39. Lerma, J.L.; Navarro, S.; Cabrelles, M.; Seguí, A.E. Camera calibration with baseline distance constraints. Photogramm. Rec. 2010, 25, 140–158. [Google Scholar] [CrossRef]
  40. Habib, A.; Detchev, I.; Kwak, E. Stability analysis for a multi-camera photogrammetric system. Sensors 2014, 14, 15084–15112. [Google Scholar] [CrossRef] [PubMed]
  41. Detchev, I.; Mazaheri, M.; Rondeel, S.; Habib, A. Calibration of multi-camera photogrammetric systems. ISPRS-Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, XL–1, 101–108. [Google Scholar]
  42. Garrido-Jurado, S.; Muñoz-Salinas, R.; Madrid-Cuevas, F.; Marín-Jiménez, M. Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognit. 2014, 47, 2280–2292. [Google Scholar] [CrossRef]
  43. Scaramuzza, D. Omnidirectional camera. In Computer Vision: A Reference Guide; Springer: New York, NY, USA, 2014; pp. 552–560. [Google Scholar]
  44. Junior, J.M.; Tommaselli, A.; Moraes, M. Calibration of a catadioptric omnidirectional vision system with conic mirror. ISPRS J. Photogramm. Remote Sens. 2016, 113, 97–105. [Google Scholar] [CrossRef]
  45. Cramer, M.; Stallmann, D.; Haala, N. Direct Georeferencing Using GPS/Inertial Exterior Orientations for Photogrammetric Applications. Int. Arch. Photogramm. Remote Sens. 2000, 33, 198–205. [Google Scholar]
  46. Habib, A.; Zhou, T.; Masjedi, A.; Zhang, Z.; Flatt, J.E.; Crawford, M. Boresight Calibration of GNSS/INS-Assisted Push-Broom Hyperspectral Scanners on UAV Platforms. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2018, 11, 1734–1749. [Google Scholar] [CrossRef]
  47. Brown, D.C. Decentering Distortion of Lenses. Photogramm. Eng. 1966, 32, 444–462. Available online: https://www.semanticscholar.org/paper/Decentering-distortion-of-lenses-Brown/2ef001c656378a1c5cf80488b35684742220d3f9 (accessed on 22 November 2019).
  48. Gruen, A.; Huang, T.S. Calibration and Orientation of Cameras in Computer Vision; Springer Science & Business Media: Berlin, Germany, 2013; Volume 34. [Google Scholar]
  49. Nistér, D. An efficient solution to the five-point relative pose problem. IEEE Trans. Pattern Anal. Mach. Intell. 2004, 26, 756–777. [Google Scholar] [CrossRef]
  50. Hartley, R.I. In defence of the 8-point algorithm. In Proceedings of the IEEE International Conference on Computer Vision, Cambridge, MA, USA, 20–23 June 1995; pp. 1064–1070. [Google Scholar]
  51. Rau, J.-Y.; Su, B.; Hsiao, K.; Jhan, J. Systematic Calibration for a Backpacked Spherical Photogrammetric Imaging System. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, 41. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Configuration of GNSS, IMU and the multi-camera.
Figure 2. Ladybug 5+ (front), NovAtel IMU-ISA-100C (black-and-white box, rear) and NovAtel VEXXIS GNSS antennas. The IMU combines fiber optic gyros with temperature-compensated Micro Electromechanical Systems (MEMS) accelerometers; its measurements are used by the SPAN receiver to compute a blended GNSS+INS position, velocity, and attitude solution at rates of up to 200 Hz.
Figure 3. Orthophoto of the central Inkoo area. The blue rectangle outlines the outdoor calibration dataset, the red rectangles mark the check sites, and the orange dots show the locations of the captured multi-images.
Figure 4. FGI’s calibration room. (a) 3D locations of the coded targets; (b) panorama of the room.
Figure 5. A distorted image (a) and its corresponding undistorted image (b) from camera 1 of the Ladybug 5+.
Figure 6. Absolute values of the normalized cross-correlations for camera 2 after indoor calibration of the Ladybug camera.
Figure 7. Absolute values of the normalized cross-correlations for all cameras after indoor calibration of the Ladybug camera.
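Figures 6 and 7 visualize the dependencies between the estimated calibration parameters. As a point of reference, a minimal sketch of how such a map can be derived from the parameter covariance matrix of the bundle adjustment is given below; the covariance values used here are hypothetical and purely illustrative.

```python
import numpy as np

def correlation_from_covariance(C):
    """Normalized cross-correlations rho_ij = C_ij / sqrt(C_ii * C_jj)
    from a parameter covariance matrix C; absolute values are returned,
    matching what Figures 6 and 7 display."""
    d = np.sqrt(np.diag(C))
    return np.abs(C / np.outer(d, d))

# Hypothetical 3 x 3 covariance of three parameters, for illustration only.
C = np.array([[ 4.0, 1.2, -0.6],
              [ 1.2, 9.0,  0.3],
              [-0.6, 0.3,  1.0]])
print(correlation_from_covariance(C))
```

Off-diagonal values close to zero indicate that the calibration network separates the parameters well, whereas values close to one would signal parameters that the network cannot distinguish.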
Figure 8. Configuration of the outdoor calibration dataset. (a) 3D project, (b) orthophoto overlaid with image positions and GCPs.
Figure 9. Configuration of the cross-check dataset 1. (a) 3D project of dataset 1; (b) orthophoto overlaid with image positions and GCPs.
Figure 10. Configuration of the cross-check dataset 2. (a) 3D project of dataset 2; (b) orthophoto overlaid with image positions and GCPs.
Figure 11. Sorted re-projection errors and their corresponding incident angles.
Figure 12. Errors in check points in all areas. (a) Differences between measured GPS positions and direct-georeferencing positions; (b) the corresponding intersection angles.
Figure 13. Contributions and overlaps of all cameras’ pixels (1–6) in the final panoramic compilation.
Figure 14. Correspondence map (minimum incident-angle criterion) of all images (1–6) in the final panoramic compilation.
Figure 15. Compilation of a non-stitched panorama (no color blending).
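Figures 13–15 are produced by assigning every panorama pixel to exactly one source camera. As a minimal sketch of this minimum incident-angle criterion, the snippet below traces equirectangular panorama rays and picks the camera whose optical axis is closest; the axis directions, panorama size, and per-camera field of view are illustrative assumptions, not the calibrated values of the system.

```python
import numpy as np

def correspondence_map(width, height, cam_axes, fov_deg=100.0):
    """Assign each pixel of an equirectangular panorama to the camera whose
    optical axis makes the smallest angle with the pixel's viewing ray
    (the minimum incident-angle criterion). cam_axes holds one unit
    optical-axis vector per camera; pixels outside every assumed field of
    view are marked -1."""
    lon = np.arange(width) / width * 2.0 * np.pi - np.pi       # azimuth
    lat = np.pi / 2.0 - np.arange(height) / height * np.pi     # elevation
    lon, lat = np.meshgrid(lon, lat)
    rays = np.stack([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)], axis=-1)                    # (H, W, 3)
    ang = np.arccos(np.clip(rays @ np.asarray(cam_axes).T, -1.0, 1.0))
    idx = np.argmin(ang, axis=-1)                              # winning camera
    idx[np.min(ang, axis=-1) > np.radians(fov_deg / 2.0)] = -1
    return idx

# Five horizontal cameras at 72 degree spacing plus one zenith camera
# (idealized axes, not the calibrated orientations of Table 2).
axes = [[np.cos(a), np.sin(a), 0.0] for a in np.radians([0, 72, 144, 216, 288])]
axes.append([0.0, 0.0, 1.0])
print(correspondence_map(64, 32, axes))
```

Because every panorama pixel is traced back to exactly one source image, the compiled panorama stays metric: a measurement in the panorama can be mapped back to a pixel of an original, calibrated camera.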
Table 1. Calibrated interior orientation of the individual cameras of the Ladybug camera.

| Cam id. | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| F (px) | 1247.63 | 1243.07 | 1242.20 | 1240.03 | 1245.66 | 1241.88 |
| std. | 0.04 | 0.07 | 0.05 | 0.04 | 0.03 | 0.07 |
| PPx (px) | 1217.37 | 1204.01 | 1219.65 | 1218.62 | 1230.57 | 1229.84 |
| std. | 0.01 | 0.08 | 0.07 | 0.04 | 0.04 | 0.10 |
| PPy (px) | 1036.31 | 1012.60 | 1024.03 | 1028.36 | 1036.77 | 1021.02 |
| std. | 0.01 | 0.11 | 0.07 | 0.05 | 0.05 | 0.10 |
| K1 | 0.383946 | 0.371983 | 0.388319 | 0.380270 | 0.384972 | 0.388471 |
| std. | 2.00 × 10−4 | 4.00 × 10−4 | 3.00 × 10−4 | 2.00 × 10−4 | 2.00 × 10−4 | 3.00 × 10−4 |
| K2 | 0.017540 | 0.035774 | 0.003023 | 0.023847 | 0.013512 | 0.003508 |
| std. | 5.00 × 10−4 | 8.00 × 10−4 | 6.00 × 10−4 | 4.00 × 10−4 | 4.00 × 10−4 | 6.00 × 10−4 |
| K3 | 0.177268 | 0.166916 | 0.191546 | 0.175797 | 0.180382 | 0.187534 |
| std. | 3.00 × 10−4 | 6.00 × 10−4 | 4.00 × 10−4 | 3.00 × 10−4 | 2.00 × 10−4 | 4.00 × 10−4 |
| P1 | −4.8 × 10−4 | −6.0 × 10−4 | −6.3 × 10−4 | 1.9 × 10−4 | 7.8 × 10−4 | −5.8 × 10−4 |
| std. | 1.0 × 10−5 | 1.0 × 10−5 | 1.0 × 10−5 | 1.0 × 10−5 | 1.0 × 10−5 | 2.0 × 10−5 |
| P2 | −1.9 × 10−4 | −9.0 × 10−4 | 6.2 × 10−4 | 5.5 × 10−4 | 2.6 × 10−4 | 1.6 × 10−4 |
| std. | 1.1 × 10−5 | 1.9 × 10−5 | 1.4 × 10−5 | 9.0 × 10−6 | 1.0 × 10−5 | 1.8 × 10−5 |
| δ | −3.4 × 10−4 | −1.7 × 10−4 | −5.2 × 10−4 | −2.7 × 10−4 | −1.7 × 10−4 | −7.8 × 10−4 |
| std. | 1.0 × 10−5 | 3.0 × 10−5 | 2.0 × 10−5 | 1.0 × 10−5 | 1.0 × 10−5 | 3.0 × 10−5 |
| λ | −3.0 × 10−5 | 3.3 × 10−5 | 7.5 × 10−5 | 4.7 × 10−5 | −6.0 × 10−5 | 8.6 × 10−5 |
| std. | 7.0 × 10−6 | 1.0 × 10−6 | 9.0 × 10−6 | 5.0 × 10−6 | 5.0 × 10−6 | 1.0 × 10−6 |
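As an illustration of how the coefficients in Table 1 can be applied, the following is a minimal sketch of a Brown-type radial and decentering correction for camera 1. The normalization by the focal length, the single-step (non-iterative) correction, and the omission of the affinity terms δ and λ are simplifying assumptions; the paper’s exact formulation may differ.

```python
import numpy as np

# Calibrated values for camera 1 from Table 1 (pixels).
F, PPx, PPy = 1247.63, 1217.37, 1036.31
K1, K2, K3 = 0.383946, 0.017540, 0.177268
P1, P2 = -4.8e-4, -1.9e-4

def correct_point(u, v):
    """One-step Brown-type radial + decentering correction of a pixel (u, v).
    Sketch only: coordinates are normalized by F, the affinity terms
    (delta, lambda) are omitted, and an exact inversion would iterate."""
    x, y = (u - PPx) / F, (v - PPy) / F              # reduced coordinates
    r2 = x * x + y * y
    radial = K1 * r2 + K2 * r2**2 + K3 * r2**3       # radial distortion
    dx = x * radial + P1 * (r2 + 2 * x * x) + 2 * P2 * x * y
    dy = y * radial + P2 * (r2 + 2 * y * y) + 2 * P1 * x * y
    return PPx + F * (x - dx), PPy + F * (y - dy)    # corrected pixel

print(correct_point(2000.0, 1500.0))
```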
Table 2. Calibrated structure of the Ladybug camera (orientations in degrees, positions in cm).

| Id. | ζ (°) | η (°) | ψ (°) | std. ζ (°) | std. η (°) | std. ψ (°) |
|---|---|---|---|---|---|---|
| 1 | 0 | 0 | 0 | 0.0005 | 0.0005 | 0.0005 |
| 2 | 71.96 | 359.70 | 359.81 | 0.005 | 0.001 | 0.004 |
| 3 | 143.92 | 359.66 | 0.13 | 0.003 | 0.003 | 0.000006 |
| 4 | 216.04 | 359.80 | 359.99 | 0.002 | 0.001 | 0.001 |
| 5 | 288.00 | 359.97 | 359.97 | 0.002 | 0.0005 | 0.002 |
| 6 | 29.63 | 89.80 | 150.45 | 0.63 | 0.003 | 0.63 |

| Id. | Δx | Δy | Δz | std. Δx | std. Δy | std. Δz |
|---|---|---|---|---|---|---|
| 1 | 0 | 0 | 0 | 0.002 | 0.002 | 0.002 |
| 2 | 0.0388 | 5.7022 | 4.1770 | 0.0022 | 0.0036 | 0.003 |
| 3 | 0.0156 | 3.5351 | 10.922 | 0.0021 | 0.0024 | 0.003 |
| 4 | −0.0239 | −3.5487 | 10.883 | 0.0016 | 0.0021 | 0.002 |
| 5 | −0.0356 | −5.7454 | 4.1450 | 0.0017 | 0.0025 | 0.002 |
| 6 | −7.5553 | −0.0293 | 5.9617 | 0.0041 | 0.0031 | 0.003 |
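The angles and offsets in Table 2 define each camera’s pose relative to camera 1. Below is a minimal sketch of how such a relative orientation could be applied; mapping (ζ, η, ψ) onto a z–y–x Euler sequence is an assumption made purely for illustration, not the paper’s stated convention.

```python
import numpy as np

def rot(axis, deg):
    """Elementary rotation matrix about the 'x', 'y' or 'z' axis (degrees)."""
    c, s = np.cos(np.radians(deg)), np.sin(np.radians(deg))
    m = {'x': [[1, 0, 0], [0, c, -s], [0, s, c]],
         'y': [[c, 0, s], [0, 1, 0], [-s, 0, c]],
         'z': [[c, -s, 0], [s, c, 0], [0, 0, 1]]}
    return np.array(m[axis])

# Camera 2 relative to camera 1 (Table 2): angles in degrees, offsets in cm.
R_21 = rot('z', 71.96) @ rot('y', 359.70) @ rot('x', 359.81)  # assumed sequence
t_21 = np.array([0.0388, 5.7022, 4.1770]) / 100.0             # cm -> m

def cam2_to_cam1(p):
    """Express a point given in camera 2's frame in camera 1's frame."""
    return R_21 @ np.asarray(p) + t_21

print(cam2_to_cam1([0.0, 0.0, 1.0]))
```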
Table 3. Calibrated values for the lever-arm vector and boresight misalignments.

| | ω | φ | κ | std. ω | std. φ | std. κ |
|---|---|---|---|---|---|---|
| Boresight (°) | −105.80 | −41.41 | 69.95 | 0.53 | 0.11 | 0.45 |

| | X | Y | Z | std. X | std. Y | std. Z |
|---|---|---|---|---|---|---|
| Lever-arm (m) | −2.4189 | −0.2824 | 0.7361 | 0.02 | 0.02 | 0.02 |
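Table 3 feeds directly into direct georeferencing: a camera-frame vector is rotated by the boresight matrix, shifted by the lever-arm, and then transformed by the GNSS/IMU pose. The sketch below assumes the common photogrammetric ω–φ–κ rotation convention and uses an illustrative platform pose; it is not the paper’s implementation.

```python
import numpy as np

# Lever-arm vector (m) from Table 3.
LEVER_ARM = np.array([-2.4189, -0.2824, 0.7361])

def euler(omega, phi, kappa):
    """omega-phi-kappa rotation (degrees); this common photogrammetric
    convention is assumed here, the paper may use a different one."""
    o, p, k = np.radians([omega, phi, kappa])
    Rx = np.array([[1, 0, 0], [0, np.cos(o), -np.sin(o)], [0, np.sin(o), np.cos(o)]])
    Ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(k), -np.sin(k), 0], [np.sin(k), np.cos(k), 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

R_BS = euler(-105.80, -41.41, 69.95)   # boresight misalignment from Table 3

def georeference(p_cam, r_ins, R_ins):
    """Direct georeferencing of a camera-frame vector p_cam (m), given the
    GNSS/IMU position r_ins and body-to-mapping-frame rotation R_ins."""
    return r_ins + R_ins @ (LEVER_ARM + R_BS @ np.asarray(p_cam))

# Toy pose: platform at the origin with identity attitude (illustrative only).
print(georeference([0.0, 0.0, 1.0], np.zeros(3), np.eye(3)))
```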
Table 4. Residuals for the ground control points of the outdoor calibration dataset.

| Point id. | X | Y | Z | Length | Num. Intersect. |
|---|---|---|---|---|---|
| 137 | −1.6 | 9.1 | 2.2 | 9.5 | 6 |
| 138 | 6.2 | 5.0 | 3.1 | 8.6 | 6 |
| 139 | 12.6 | −10.5 | −2.4 | 16.5 | 6 |
| 140 | −14.2 | 1.4 | −0.6 | 14.3 | 6 |
| 141 | −13.5 | 4.5 | 1.4 | 14.3 | 6 |
| 142 | 4.1 | −7.5 | 0.1 | 8.5 | 6 |
| 143 | 3.0 | −2.0 | 1.4 | 3.8 | 6 |
| Mean | −0.48 | 0 | 0.74 | 10.78 | |
| RMSE | 9.31 | 6.55 | 1.7 | 14.10 | |
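The Length column of Table 4 is the Euclidean norm of the per-axis residuals, and the RMSE row follows from their mean squares. A short check is given below; small deviations from the printed table stem from the rounding of the displayed residuals.

```python
import numpy as np

# Per-axis GCP residuals from Table 4.
res = np.array([[ -1.6,   9.1,  2.2],
                [  6.2,   5.0,  3.1],
                [ 12.6, -10.5, -2.4],
                [-14.2,   1.4, -0.6],
                [-13.5,   4.5,  1.4],
                [  4.1,  -7.5,  0.1],
                [  3.0,  -2.0,  1.4]])
length = np.linalg.norm(res, axis=1)      # 3D length of each residual vector
rmse = np.sqrt(np.mean(res**2, axis=0))   # per-axis RMSE over all points
print(length.round(1))                    # compare with the Length column
print(rmse.round(2))                      # compare with the RMSE row
```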
Table 5. Differences between adjusted values from BBA and direct georeferencing for the outdoor calibration dataset.

| Camera id. | Δω (°) | Δφ (°) | Δκ (°) | ΔX0 (cm) | ΔY0 (cm) | ΔZ0 (cm) |
|---|---|---|---|---|---|---|
| 1 | 0.38 | 0.13 | 0.07 | 14.82 | −4.11 | −4.24 |
| 2 | 0.37 | 0.38 | 0.40 | 7.14 | −1.58 | −3.56 |
| 3 | 0.02 | 0.08 | 0.04 | 2.85 | −0.61 | −1.97 |
| 4 | −0.42 | −0.22 | −0.15 | −1.00 | −1.48 | −0.04 |
| 5 | −0.17 | −0.31 | −0.32 | −5.93 | 1.76 | 0.17 |
| 6 | −0.19 | −0.06 | −0.03 | −13.16 | 2.70 | −0.09 |
| Mean | 0.26 | 0.20 | 0.17 | 7.48 | 2.04 | 1.68 |
| RMSE | 0.29 | 0.23 | 0.22 | 8.98 | 2.25 | 1.77 |
Table 6. Differences between the observed and intersected positions of check points in the check1 dataset.

| Point ID | X (cm) | Y (cm) | Z (cm) | Length (cm) | Num. Intersect. |
|---|---|---|---|---|---|
| 187 | −4.526 | −2.3048 | 0.4974 | 5.1034 | 6 |
| 188 | −4.4479 | −2.2805 | 0.5311 | 5.0265 | 5 |
| 189 | −4.599 | −2.3142 | 0.4967 | 5.1723 | 3 |
| 190 | −4.5909 | −2.3126 | 0.5021 | 5.165 | 3 |
| 191 | −4.2558 | −2.3015 | 0.5225 | 4.8664 | 6 |
| 193 | −4.9979 | −0.8523 | 0.3865 | 5.0848 | 14 |
| 183 | −6.3937 | −0.5021 | 0.4623 | 6.4301 | 3 |
| 184 | −6.3916 | −0.4969 | 0.4592 | 6.4273 | 3 |
| 185 | −6.4107 | −0.4484 | 0.4817 | 6.4444 | 3 |
| 186 | −6.4102 | −0.4492 | 0.4771 | 6.4436 | 3 |
| 192 | −4.5299 | −2.2095 | 0.4077 | 5.0565 | 4 |
| 184 | −6.3916 | −0.4969 | 0.4592 | 6.4273 | 3 |
| Mean | −5.32 | −1.44 | 0.47 | 5.63 | |
| RMSE | 0.91 | 0.87 | 0.04 | 0.67 | |
Table 7. Differences between the observed and intersected positions of check points in the check2 dataset.

| Point ID | X (cm) | Y (cm) | Z (cm) | Length (cm) | Num. Intersect. |
|---|---|---|---|---|---|
| 164 | 0.9102 | 0.7129 | −0.5299 | 1.2718 | 12 |
| 165 | 1.0354 | 0.6624 | −0.5199 | 1.3346 | 12 |
| 166 | 1.1804 | 0.6085 | −0.5237 | 1.4275 | 12 |
| 167 | 1.3231 | 0.5529 | −0.5126 | 1.5229 | 12 |
| 168 | 1.1808 | 0.7859 | −0.4918 | 1.5013 | 10 |
| 169 | 1.2843 | 0.879 | −0.4498 | 1.62 | 9 |
| 170 | 0.9248 | 0.1996 | −0.4727 | 1.0576 | 10 |
| 171 | 1.0858 | 0.3109 | −0.4796 | 1.227 | 9 |
| 172 | 1.2463 | 0.4695 | −0.5275 | 1.4325 | 11 |
| 173 | 1.4003 | 0.4883 | −0.52 | 1.5715 | 11 |
| Mean | 1.15 | 0.56 | −0.50 | 1.40 | |
| RMSE | 0.02 | 0.03 | 0.0007 | 0.02 | |
