Data-driven analysis of tiny touchscreen performance with MicroJam

    Research output: Contribution to journal › Article › peer-review

    Abstract

    The widespread adoption of mobile devices, such as smartphones and tablets, has made touchscreens a common interface for musical performance. Although new mobile music instruments have been investigated from design and user experience perspectives, there has been little examination of the performers’ musical output. In this work, we introduce a constrained touchscreen performance app, MicroJam, designed to enable collaboration between performers, and engage in a data-driven analysis of more than 1,600 performances using the app. MicroJam constrains performances to five seconds, and emphasizes frequent and casual music-making through a social media–inspired interface. Performers collaborate by replying to performances, adding new musical layers that are played back at the same time. Our analysis shows that users tend to focus on the center and diagonals of the touchscreen area, and that they tend to swirl or swipe rather than tap. We also observe that, whereas long swipes dominate the visual appearance of performances, the majority of interactions are short with limited expressive possibilities. Our findings enhance our understanding of how users perform in touchscreen apps and could be applied in future app designs for social musical interaction.
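    The abstract distinguishes taps, swipes, and swirls without defining them here; the distinction can be illustrated with a small heuristic over logged touch data. The sketch below is a hypothetical classifier for a stroke of (x, y, time) points; the thresholds, data layout, and function name are assumptions chosen for illustration, not the metrics used in the MicroJam analysis.

    ```python
    # Hypothetical sketch: label a logged touch stroke as a tap, swipe, or swirl.
    # Thresholds and the (x, y, t) layout are illustrative assumptions, not the
    # measures used in the MicroJam study.
    import math
    from typing import List, Tuple

    Point = Tuple[float, float, float]  # (x, y, time in seconds), screen coords in [0, 1]

    def classify_stroke(points: List[Point],
                        tap_max_duration: float = 0.15,
                        tap_max_path: float = 0.02,
                        swirl_min_turning: float = 2 * math.pi) -> str:
        """Return 'tap', 'swipe', or 'swirl' using assumed heuristic thresholds."""
        if len(points) < 2:
            return "tap"
        duration = points[-1][2] - points[0][2]
        # Total path length travelled by the finger across the stroke.
        path = sum(math.hypot(x2 - x1, y2 - y1)
                   for (x1, y1, _), (x2, y2, _) in zip(points, points[1:]))
        if duration <= tap_max_duration and path <= tap_max_path:
            return "tap"
        # Accumulated turning angle: large values suggest a circular "swirl".
        turning = 0.0
        for (x1, y1, _), (x2, y2, _), (x3, y3, _) in zip(points, points[1:], points[2:]):
            a1 = math.atan2(y2 - y1, x2 - x1)
            a2 = math.atan2(y3 - y2, x3 - x2)
            d = a2 - a1
            # Wrap the angle difference into (-pi, pi] before accumulating.
            turning += abs((d + math.pi) % (2 * math.pi) - math.pi)
        return "swirl" if turning >= swirl_min_turning else "swipe"
    ```

    Under these assumed heuristics, a stroke with almost no movement and a very short duration counts as a tap, while the accumulated turning angle separates roughly circular swirls from straighter swipes.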

    Original language: English
    Pages (from-to): 41-57
    Number of pages: 17
    Journal: Computer Music Journal
    Volume: 43
    Issue number: 4
    DOIs
    Publication status: Published - 1 Oct 2020
