Feature matching and pose estimation using Newton iteration

Hongdong Li*, Richard Hartley

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    4 Citations (Scopus)

    Abstract

    Feature matching and pose estimation are two crucial tasks in computer vision. The widely adopted scheme is to first find the correct matches and then estimate the transformation parameters. Unfortunately, such a simple scheme does not always work well, because the two tasks of matching and estimation are mutually interlocked. This paper proposes a new method that estimates the transformation and finds the correct matches simultaneously. The interlock is disentangled by an alternating Newton iteration method. We formulate the problem as a nearest-matrix problem and provide a different numerical technique. Experiments on both synthetic and real images gave good results. Fast global convergence was obtained without the need for a good initial guess.
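
    The abstract describes the alternating idea only in general terms. The following is a minimal, illustrative Python sketch of such an alternation; the function name, the closed-form affine least-squares refit, and the use of SciPy's Hungarian assignment solver are assumptions for demonstration, not the authors' Newton-based nearest-matrix formulation.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        def alternating_match_and_estimate(src, dst, n_iters=20):
            """Jointly recover a 2-D affine transform and correspondences
            between two point sets (hypothetical helper, for illustration)."""
            A, t = np.eye(2), np.zeros(2)   # current transform estimate
            for _ in range(n_iters):
                # Matching step: optimal assignment under the current transform.
                proj = src @ A.T + t
                cost = np.linalg.norm(proj[:, None, :] - dst[None, :, :], axis=2)
                rows, cols = linear_sum_assignment(cost)
                # Estimation step: least-squares affine refit to the matches
                # (a closed-form stand-in for one Newton-style update).
                X = np.hstack([src[rows], np.ones((len(rows), 1))])
                P, *_ = np.linalg.lstsq(X, dst[cols], rcond=None)
                A, t = P[:2].T, P[2]
            return A, t, (rows, cols)

    Each sweep re-solves the assignment under the current transform and then refits the transform to the chosen matches. Unlike the paper's Newton scheme, which reports fast global convergence without a good initial guess, this toy alternation can stall in a local minimum.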

    Original language: English
    Title of host publication: Image Analysis and Processing - ICIAP 2005, 13th International Conference, Proceedings
    Publisher: Springer Verlag
    Pages: 196-203
    Number of pages: 8
    ISBN (Print): 3540288694, 9783540288695
    DOIs
    Publication status: Published - 2005
    Event: 13th International Conference on Image Analysis and Processing, ICIAP 2005 - Cagliari, Italy
    Duration: 6 Sept 2005 - 8 Sept 2005

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 3617 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Conference

    Conference: 13th International Conference on Image Analysis and Processing, ICIAP 2005
    Country/Territory: Italy
    City: Cagliari
    Period: 6/09/05 - 8/09/05
