TY - GEN
T1 - Integrated navigation system using camera and gimbaled laser scanner for indoor and outdoor autonomous flight of UAVs
AU - Huh, Sungsik
AU - Shim, David Hyunchul
AU - Kim, Jonghyuk
PY - 2013
Y1 - 2013
N2 - This paper describes an integrated navigation sensor module, comprising a camera, a laser scanner, and an inertial sensor, for unmanned aerial vehicles (UAVs) to fly both indoors and outdoors. The camera and the gimbaled laser scanner work in a complementary manner to extract feature points from the environment around the vehicle. The features are processed by an online extended Kalman filter (EKF)-based simultaneous localization and mapping (SLAM) algorithm to estimate the navigational states of the vehicle. In this paper, a new method is proposed for calibrating the camera and the gimbaled laser scanner against each other using a simple visual marker. We also propose a real-time navigation algorithm based on EKF SLAM, tailored to our camera-laser sensor package, which merges image features with laser range data for state estimation. Finally, these sensors and algorithms are implemented on our octo-rotor UAV platform, and the results show that our onboard navigation module can provide a real-time three-dimensional navigation solution without any assumptions or prior information about the surroundings.
AB - This paper describes an integrated navigation sensor module, comprising a camera, a laser scanner, and an inertial sensor, for unmanned aerial vehicles (UAVs) to fly both indoors and outdoors. The camera and the gimbaled laser scanner work in a complementary manner to extract feature points from the environment around the vehicle. The features are processed by an online extended Kalman filter (EKF)-based simultaneous localization and mapping (SLAM) algorithm to estimate the navigational states of the vehicle. In this paper, a new method is proposed for calibrating the camera and the gimbaled laser scanner against each other using a simple visual marker. We also propose a real-time navigation algorithm based on EKF SLAM, tailored to our camera-laser sensor package, which merges image features with laser range data for state estimation. Finally, these sensors and algorithms are implemented on our octo-rotor UAV platform, and the results show that our onboard navigation module can provide a real-time three-dimensional navigation solution without any assumptions or prior information about the surroundings.
UR - http://www.scopus.com/inward/record.url?scp=84893801077&partnerID=8YFLogxK
U2 - 10.1109/IROS.2013.6696805
DO - 10.1109/IROS.2013.6696805
M3 - Conference contribution
SN - 9781467363587
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 3158
EP - 3163
BT - IROS 2013
T2 - 2013 26th IEEE/RSJ International Conference on Intelligent Robots and Systems: New Horizon, IROS 2013
Y2 - 3 November 2013 through 8 November 2013
ER -