Abstract
We present and evaluate a novel interface for tracking ensemble performances on touch-screens. The system uses a Random Forest classifier to identify touch-screen gestures and transition-matrix statistics to analyse the resulting gesture-state sequences across an ensemble of performers. A series of specially designed iPad apps respond to this real-time analysis of free-form gestural performances with calculated modifications to their musical interfaces. We describe our system and evaluate it through cross-validation and profiling, as well as through concert experience.
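The transition-matrix analysis mentioned above can be illustrated with a minimal sketch: given a sequence of classified gesture states, estimate first-order transition probabilities between them. The gesture labels here are hypothetical placeholders, not the categories used in the paper, and this is an assumed illustration rather than the authors' implementation.

```python
# Hypothetical gesture classes for illustration only.
GESTURES = ["swipe", "tap", "swirl"]

def transition_matrix(states, labels=GESTURES):
    """Estimate first-order transition probabilities from a gesture-state sequence."""
    idx = {g: i for i, g in enumerate(labels)}
    counts = [[0.0] * len(labels) for _ in labels]
    # Count transitions between consecutive gesture states.
    for a, b in zip(states, states[1:]):
        counts[idx[a]][idx[b]] += 1
    # Normalise each row into a probability distribution (rows with no
    # observed transitions are left as zeros).
    for row in counts:
        total = sum(row)
        if total:
            for j in range(len(row)):
                row[j] /= total
    return counts

# Example: a short gesture-state sequence from one performer.
seq = ["tap", "tap", "swipe", "swirl", "tap", "swipe"]
T = transition_matrix(seq)
```

Statistics derived from such matrices (e.g. how often performers change gesture, or how similar their transition patterns are) could then drive the real-time interface modifications described in the abstract.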
Original language | English
---|---
Pages (from-to) | 359–364
Number of pages | 6
Journal | Proceedings of the International Conference on New Interfaces for Musical Expression
Publication status | Published - 2015
Event | 15th International Conference on New Interfaces for Musical Expression, NIME 2015, Baton Rouge, United States, 31 May 2015 – 3 Jun 2015