TY - GEN
T1 - Single-shot extrinsic calibration of a generically configured RGB-D camera rig from scene constraints
AU - Yang, Jiaolong
AU - Dai, Yuchao
AU - Li, Hongdong
AU - Gardner, Henry
AU - Jia, Yunde
PY - 2013
Y1 - 2013
AB - With the increasing use of commodity RGB-D cameras for computer vision, robotics, mixed and augmented reality and other areas, it is of significant practical interest to calibrate the relative pose between a depth (D) camera and an RGB camera in these types of setups. In this paper, we propose a new single-shot, correspondence-free method to extrinsically calibrate a generically configured RGB-D camera rig. We formulate the extrinsic calibration problem as one of geometric 2D-3D registration, which exploits scene constraints to achieve single-shot extrinsic calibration. Our method first reconstructs sparse point clouds from a single-view 2D image. These sparse point clouds are then registered with dense point clouds from the depth camera. Finally, we directly optimize the warping quality by evaluating scene constraints in 3D point clouds. Our single-shot extrinsic calibration method does not require correspondences across multiple color images or across different modalities, and it is more flexible than existing methods. The scene constraints can be very simple, and we demonstrate that a scene containing three sheets of paper is sufficient to obtain reliable calibration with a lower geometric error than existing methods.
KW - I.4.1 [Image Processing and Computer Vision]: Digitization and Image Capture - Camera calibration
KW - I.4.3 [Image Processing and Computer Vision]: Enhancement - Registration
KW - I.4.8 [Image Processing and Computer Vision]: Scene Analysis - Range data
KW - H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems - Artificial, augmented, and virtual realities
UR - http://www.scopus.com/inward/record.url?scp=84893305780&partnerID=8YFLogxK
DO - 10.1109/ISMAR.2013.6671778
M3 - Conference contribution
SN - 9781479928699
T3 - 2013 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2013
SP - 181
EP - 188
BT - 2013 IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2013
T2 - 12th IEEE and ACM International Symposium on Mixed and Augmented Reality, ISMAR 2013
Y2 - 1 October 2013 through 4 October 2013
ER -