Active gaze tracking for human-robot interaction

R. Atienza, A. Zelinsky

    Research output: Conference contribution (chapter in book/report/conference proceeding), peer-reviewed

    24 Citations (Scopus)

    Abstract

    In our effort to make human-robot interfaces more user-friendly, we built an active gaze tracking system that can measure a person's gaze direction in real time. Gaze normally indicates which object in the surroundings a person is interested in. It can therefore serve as a medium for human-robot interaction, for example instructing a robot arm to pick up the object a user is looking at. We discuss how we developed and integrated algorithms for zoom camera calibration, low-level control of an active head, and face and gaze tracking to create an active gaze tracking system.
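    The paper's example use case — instructing a robot arm to pick up the object a user is looking at — reduces to selecting the object closest (angularly) to the measured gaze ray. A minimal sketch of that selection step, assuming the gaze direction and object positions are already available in a common world frame (the function name, tolerance, and scene layout below are illustrative, not from the paper):

    ```python
    import numpy as np

    def select_gazed_object(eye_pos, gaze_dir, objects, max_angle_deg=10.0):
        """Pick the object whose direction from the eye is angularly
        closest to the gaze ray.

        eye_pos:  (3,) eye position in the world frame.
        gaze_dir: (3,) measured gaze direction (need not be unit length).
        objects:  dict mapping object name -> (3,) world position.
        Returns the selected object's name, or None if no object lies
        within max_angle_deg of the gaze ray.
        """
        gaze_dir = np.asarray(gaze_dir, dtype=float)
        gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
        best_name, best_angle = None, np.radians(max_angle_deg)
        for name, pos in objects.items():
            to_obj = np.asarray(pos, dtype=float) - eye_pos
            to_obj = to_obj / np.linalg.norm(to_obj)
            # Angle between the gaze ray and the eye-to-object direction.
            angle = np.arccos(np.clip(gaze_dir @ to_obj, -1.0, 1.0))
            if angle < best_angle:
                best_name, best_angle = name, angle
        return best_name

    # Hypothetical scene: two objects on a table, gaze aimed at the cup.
    eye = np.array([0.0, 0.0, 1.5])
    gaze = np.array([0.05, 1.0, -0.5])
    objects = {"cup": (0.1, 2.0, 0.5), "book": (-0.8, 2.0, 0.5)}
    print(select_gazed_object(eye, gaze, objects))
    ```

    The angular tolerance absorbs gaze-measurement noise; a real system would also need the head-to-world transform from the active head's kinematics before the gaze ray can be expressed in the robot's frame.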

    Original language: English
    Title of host publication: Proceedings - 4th IEEE International Conference on Multimodal Interfaces, ICMI 2002
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 261-266
    Number of pages: 6
    ISBN (Electronic): 0769518346, 9780769518343
    DOIs
    Publication status: Published - 2002
    Event: 4th IEEE International Conference on Multimodal Interfaces, ICMI 2002 - Pittsburgh, United States
    Duration: 14 Oct 2002 - 16 Oct 2002

    Publication series

    Name: Proceedings - 4th IEEE International Conference on Multimodal Interfaces, ICMI 2002

    Conference

    Conference: 4th IEEE International Conference on Multimodal Interfaces, ICMI 2002
    Country/Territory: United States
    City: Pittsburgh
    Period: 14/10/02 - 16/10/02
