Biologically-inspired time and location of impact prediction from optical flow

Chris McCarthy*, Giorgio Metta

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    1 Citation (Scopus)

    Abstract

    We investigated the use of optical flow to predict the time and location of impact of an incoming object. By examining local patterns of optical flow, we predict an object's trajectory with respect to a stationary observer, and its time-to-contact with the observer's (assumed planar) body. Such a cue may serve as input to motor responses for reactive threat avoidance or object interception (such as catching a ball). The presented approach is based on the observed behaviour of neurons in the F4 region of the pre-motor cortex of primates, and is part of a larger project which aims at modelling multi-sensory neurons and their contribution to reaching behaviour on the iCub humanoid platform. We present preliminary experimental results of a computational model of F4's visual responses using real image sequences acquired from the robot, and demonstrate their application in a real-time threat detection/prediction system.
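    The abstract's core visual cues — time-to-contact and impact location from local optical-flow patterns — can be illustrated with a minimal sketch. For a looming object approaching a stationary observer, the flow field is approximately radial about the focus of expansion (FOE), and its divergence encodes time-to-contact (for radial flow u = (x − x₀)/τ, v = (y − y₀)/τ, the divergence equals 2/τ). The function and variable names below are illustrative, not from the paper, and the example uses a synthetic flow field rather than real image sequences.

    ```python
    import numpy as np

    def estimate_ttc_and_foe(x, y, u, v):
        """Estimate time-to-contact (tau) and focus of expansion (FOE)
        from a dense optical-flow field, assuming purely radial flow:
        u = (x - x0)/tau, v = (y - y0)/tau."""
        # Divergence of the flow field; for radial flow, div = 2/tau.
        du_dx = np.gradient(u, x[0, :], axis=1)
        dv_dy = np.gradient(v, y[:, 0], axis=0)
        ttc = 2.0 / (du_dx + dv_dy).mean()
        # The FOE is where the flow vanishes: x0 = x - tau*u, y0 = y - tau*v.
        foe = ((x - ttc * u).mean(), (y - ttc * v).mean())
        return ttc, foe

    # Synthetic looming flow: impact in 1.5 s, heading toward image
    # point (0.2, -0.1) in normalized coordinates.
    tau_true, foe_true = 1.5, (0.2, -0.1)
    xs = np.linspace(-1.0, 1.0, 50)
    x, y = np.meshgrid(xs, xs)
    u = (x - foe_true[0]) / tau_true
    v = (y - foe_true[1]) / tau_true

    ttc, foe = estimate_ttc_and_foe(x, y, u, v)
    ```

    On this noise-free synthetic field, the recovered time-to-contact and FOE match the ground truth; with real flow from image sequences, local averaging and robust fitting would be needed.
    
    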

    Original language: English
    Title of host publication: 2011 IEEE International Conference on Robotics and Automation, ICRA 2011
    Pages: 6199-6204
    Number of pages: 6
    DOIs
    Publication status: Published - 2011
    Event: 2011 IEEE International Conference on Robotics and Automation, ICRA 2011 - Shanghai, China
    Duration: 9 May 2011 - 13 May 2011

    Publication series

    Name: Proceedings - IEEE International Conference on Robotics and Automation
    ISSN (Print): 1050-4729

    Conference

    Conference: 2011 IEEE International Conference on Robotics and Automation, ICRA 2011
    Country/Territory: China
    City: Shanghai
    Period: 9/05/11 - 13/05/11
