arXiv Analytics

arXiv:1804.05178 [cs.CV]

LiDAR and Camera Calibration using Motion Estimated by Sensor Fusion Odometry

Ryoichi Ishikawa, Takeshi Oishi, Katsushi Ikeuchi

Published 2018-04-14Version 1

In this paper, we propose a method for targetless and automatic Camera-LiDAR calibration. Our approach extends the hand-eye calibration framework to 2D-3D calibration. Using a sensor fusion odometry method, the scaled camera motions are computed with high accuracy. In addition, we clarify which motions are suitable for this calibration method. The proposed method requires only the three-dimensional point cloud and the camera image; it does not need other information such as LiDAR reflectance, nor an initial extrinsic parameter estimate. In our experiments, we demonstrate the method with several sensor configurations in indoor and outdoor scenes to verify its effectiveness. Our method achieves higher accuracy than other comparable state-of-the-art methods.
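The abstract builds on the classic hand-eye calibration formulation, which solves A_i X = X B_i for the fixed extrinsic transform X given paired sensor motions A_i (camera) and B_i (LiDAR). A minimal sketch of that standard formulation is below; it is not the paper's full 2D-3D pipeline, and the function names and the axis-alignment/least-squares strategy are illustrative assumptions. The rotation is recovered by aligning the rotation axes of the paired motions (Kabsch/SVD), and the translation by stacking the linear constraints (R_Ai - I) t_X = R_X t_Bi - t_Ai.

```python
import numpy as np

def _axis(R):
    # Rotation axis from the skew-symmetric part of R
    # (valid when the rotation angle is away from 0 and pi).
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return w / np.linalg.norm(w)

def hand_eye(As, Bs):
    """Solve A_i X = X B_i for the 4x4 extrinsic X.

    As: list of 4x4 camera motions, Bs: list of 4x4 LiDAR motions.
    Illustrative sketch only -- not the paper's sensor-fusion pipeline.
    """
    # Rotation: the axes satisfy a_i = R_X b_i, so align them via Kabsch/SVD.
    H = sum(np.outer(_axis(B[:3, :3]), _axis(A[:3, :3]))
            for A, B in zip(As, Bs))
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    Rx = Vt.T @ D @ U.T
    # Translation: stack (R_Ai - I) t_X = R_X t_Bi - t_Ai, solve least squares.
    M = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    b = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(M, b, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

Note that the translation is only observable when the motion set contains rotations about at least two non-parallel axes, which is exactly why the choice of "suitable motion" matters for this family of methods.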

Related articles: Most relevant | Search more
arXiv:2104.09333 [cs.CV] (Published 2021-04-19)
Camera Calibration and Player Localization in SoccerNet-v2 and Investigation of their Representations for Action Spotting
arXiv:2403.04583 [cs.CV] (Published 2024-03-07, updated 2024-03-10)
Unbiased Estimator for Distorted Conics in Camera Calibration
arXiv:2110.03479 [cs.CV] (Published 2021-10-07, updated 2022-01-24)
Camera Calibration through Camera Projection Loss