Inertial-Kinect fusion for outdoor 3D navigation

Usman Qayyum*, Jonghyuk Kim

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    12 Citations (Scopus)

    Abstract

    Lightweight and low-cost 3D sensing devices, such as the Microsoft Kinect, have gained much attention in the computer vision and robotics community. Although quite promising and successful for indoor applications, the Kinect's outdoor usage has been significantly hampered by its short detection range (around 4 meters) coupled with ambient infrared interference. This paper addresses the theoretical and practical development of an Inertial-Kinect fused SLAM framework that can handle the 3D-to-2D degeneration in Kinect sensing, called the depth dropout problem. The vision node is designed to provide either full 6DOF or partial 5DOF vehicle pose measurements depending on depth availability, whilst the low-cost inertial system designed in house (less than AUD $40) enables continuous metric mapping and navigation. Indoor and outdoor experimental results are provided, demonstrating the robustness of the proposed approach in challenging environmental conditions.
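    The abstract's switch between full 6DOF and partial 5DOF vision measurements can be illustrated with a short sketch. The following is a minimal illustration under stated assumptions, not the authors' implementation: it uses OpenCV, a hypothetical MIN_DEPTH_POINTS threshold, 3D-2D PnP for the depth-available case, and an essential-matrix decomposition for the depth-dropout case, where translation is recovered only up to scale (the 5DOF degeneracy) and metric scale would have to come from the inertial side of the filter.

```python
import numpy as np
import cv2

MIN_DEPTH_POINTS = 30  # hypothetical threshold for "enough" depth-valid matches


def vision_measurement(pts_prev_2d, pts_curr_2d, depths_prev, K):
    """Return a pose measurement whose rank depends on depth availability.

    pts_prev_2d, pts_curr_2d : (N, 2) matched pixel coordinates in two frames
    depths_prev              : (N,) Kinect depths for the previous frame
                               (NaN where the sensor returned no depth)
    K                        : (3, 3) camera intrinsic matrix
    """
    valid = np.isfinite(depths_prev)
    if valid.sum() >= MIN_DEPTH_POINTS:
        # Depth available: back-project previous-frame features to metric 3D
        # and solve 3D-2D PnP for a full 6DOF (rotation + metric translation).
        uv1 = np.hstack([pts_prev_2d[valid], np.ones((valid.sum(), 1))])
        xyz = (np.linalg.inv(K) @ uv1.T).T * depths_prev[valid, None]
        ok, rvec, tvec, _ = cv2.solvePnPRansac(
            xyz.astype(np.float64),
            pts_curr_2d[valid].astype(np.float64), K, None)
        if ok:
            return "6DOF", cv2.Rodrigues(rvec)[0], tvec
    # Depth dropout: fall back to the essential matrix, which yields rotation
    # plus a unit-norm translation direction only (5DOF, scale unobservable).
    E, _ = cv2.findEssentialMat(pts_prev_2d, pts_curr_2d, K,
                                method=cv2.RANSAC, prob=0.999, threshold=1.0)
    _, R, t_dir, _ = cv2.recoverPose(E, pts_prev_2d, pts_curr_2d, K)
    return "5DOF", R, t_dir
```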

    Original language: English
    Title of host publication: Australasian Conference on Robotics and Automation, ACRA
    Publisher: Australasian Robotics and Automation Association
    ISBN (Electronic): 9780980740448
    Publication status: Published - 2013
    Event: 2013 Australasian Conference on Robotics and Automation, ACRA 2013 - Sydney, Australia
    Duration: 2 Dec 2013 – 4 Dec 2013

    Publication series

    Name: Australasian Conference on Robotics and Automation, ACRA
    ISSN (Print): 1448-2053

    Conference

    Conference: 2013 Australasian Conference on Robotics and Automation, ACRA 2013
    Country/Territory: Australia
    City: Sydney
    Period: 2/12/13 – 4/12/13
