Visual gesture interfaces for virtual environments

R. G. O'Hagan*, A. Zelinsky, S. Rougeaux

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    57 Citations (Scopus)

    Abstract

    Virtual environments provide a whole new way of viewing and manipulating 3D data. Current technology moves the images out of desktop monitors and into the space immediately surrounding the user. Users can literally put their hands on the virtual objects. Unfortunately, techniques for interacting with such environments are yet to mature. Gloves and sensor-based trackers are unwieldy, constraining and uncomfortable to use. A natural, more intuitive method of interaction would be to allow the user to grasp objects with their hands and manipulate them as if they were real objects. We are investigating the use of computer vision in implementing a natural interface based on hand gestures. A framework for a gesture recognition system is introduced along with results of experiments in colour segmentation, feature extraction and template matching for finger and hand tracking, and simple hand pose recognition. Implementation of a gesture interface for navigation and object manipulation in virtual environments is presented.
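    The abstract describes a vision pipeline built on colour segmentation and template matching for finger and hand tracking. The sketch below is an illustrative reconstruction of that kind of pipeline, not the paper's implementation: OpenCV is assumed, and the HSV thresholds, file names (hand.png, fingertip.png) and function names are placeholders chosen for the example.

import cv2
import numpy as np

def segment_skin(frame_bgr):
    """Return a binary mask of skin-coloured pixels via HSV thresholding (thresholds are assumed)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)      # assumed lower HSV bound for skin
    upper = np.array([25, 255, 255], dtype=np.uint8)   # assumed upper HSV bound for skin
    mask = cv2.inRange(hsv, lower, upper)
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # remove small noise

def track_fingertip(gray_frame, template):
    """Locate the best match for a fingertip template in a grayscale frame."""
    result = cv2.matchTemplate(gray_frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc, max_val  # top-left corner of the best match and its score

if __name__ == "__main__":
    frame = cv2.imread("hand.png")                              # placeholder input frame
    template = cv2.imread("fingertip.png", cv2.IMREAD_GRAYSCALE)  # placeholder fingertip template
    mask = segment_skin(frame)
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.bitwise_and(gray, gray, mask=mask)               # restrict the search to skin regions
    loc, score = track_fingertip(gray, template)
    print(f"fingertip candidate at {loc}, score {score:.2f}")

    In a real system this per-frame search would feed a tracker and a pose classifier, as the abstract outlines; the example only shows the segmentation and matching steps.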

    Original language: English
    Pages (from-to): 231-250
    Number of pages: 20
    Journal: Interacting with Computers
    Volume: 14
    Issue number: 3
    DOIs
    Publication status: Published - Apr 2002
