Tracking ensemble performance on touch-screens with gesture classification and transition matrices

Charles Martin, Henry Gardner, Ben Swift

Research output: Contribution to journal › Conference article › peer-review

8 Citations (Scopus)

Abstract

We present and evaluate a novel interface for tracking ensemble performances on touch-screens. The system uses a Random Forest classifier to extract touch-screen gestures and transition matrix statistics. It analyses the resulting gesture-state sequences across an ensemble of performers. A series of specially designed iPad apps respond to this real-time analysis of free-form gestural performances with calculated modifications to their musical interfaces. We describe our system and evaluate it through cross-validation and profiling as well as concert experience.
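As a rough illustration of the pipeline the abstract describes, the sketch below trains a Random Forest on per-window touch features and builds a row-normalised transition matrix from a gesture-state sequence. It is a minimal sketch only, assuming Python with scikit-learn and NumPy and an invented four-gesture vocabulary; the system's actual features, gesture labels, and classifier parameters are not given here and will differ.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical gesture vocabulary; the paper's actual label set may differ.
GESTURES = ["nothing", "tap", "swipe", "swirl"]


def train_gesture_classifier(features: np.ndarray, labels: np.ndarray) -> RandomForestClassifier:
    """Fit a Random Forest on per-window touch features (feature layout assumed)."""
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(features, labels)
    return clf


def transition_matrix(states: list[int], n_states: int = len(GESTURES)) -> np.ndarray:
    """Count gesture-to-gesture transitions and row-normalise into probabilities."""
    counts = np.zeros((n_states, n_states))
    for prev, curr in zip(states, states[1:]):
        counts[prev, curr] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
```

Row-normalising the counts yields empirical transition probabilities, one plausible way to summarise a performer's gestural flow before comparing it across the ensemble.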

Original language: English
Pages (from-to): 359-364
Number of pages: 6
Journal: Proceedings of the International Conference on New Interfaces for Musical Expression
Publication status: Published - 2015
Event: 15th International Conference on New Interfaces for Musical Expression, NIME 2015 - Baton Rouge, United States
Duration: 31 May 2015 – 3 Jun 2015
