A physical intelligent instrument using recurrent neural networks

Torgrim R. Næss, Charles P. Martin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

6 Citations (Scopus)

Abstract

This paper describes a new intelligent interactive instrument, based on an embedded computing platform, where deep neural networks are applied to interactive music generation. Although neural networks are commonly used for music composition, many of these models do not support any form of user interaction. We introduce a self-contained intelligent instrument using generative models, with support for real-time interaction: the user can adjust high-level parameters to modify the music generated by the instrument. We describe the technical details of our generative model and discuss the experience of using the system as part of musical performance.
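
The abstract describes an RNN-based generative model steered by high-level performance parameters, but the record does not reproduce any code. The sketch below is a minimal illustration, not the authors' implementation: it assumes a PyTorch LSTM next-note model, and the `NoteRNN` class, `generate` function, and the use of sampling temperature as the live control are hypothetical stand-ins for whatever controls the instrument actually exposes.

```python
# Minimal sketch (assumed, not the paper's code): an LSTM next-note model
# with a sampling "temperature" acting as one example of a high-level
# real-time control a performer could adjust on the instrument.
import torch
import torch.nn as nn

VOCAB_SIZE = 128  # e.g. MIDI pitches 0-127


class NoteRNN(nn.Module):
    """LSTM that predicts the next note given the previous one."""

    def __init__(self, vocab_size=VOCAB_SIZE, embed_dim=64, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, vocab_size)

    def forward(self, notes, state=None):
        x = self.embed(notes)              # (batch, time, embed_dim)
        out, state = self.lstm(x, state)   # (batch, time, hidden_dim)
        return self.head(out), state       # logits over the next note


@torch.no_grad()
def generate(model, seed_note, length=32, temperature=1.0):
    """Sample a melody; low temperature gives conservative output,
    high temperature gives more surprising output."""
    model.eval()
    note = torch.tensor([[seed_note]])
    state, melody = None, [seed_note]
    for _ in range(length):
        logits, state = model(note, state)
        probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
        note = torch.multinomial(probs, 1).view(1, 1)
        melody.append(note.item())
    return melody


model = NoteRNN()  # untrained here, so the output is effectively random
print(generate(model, seed_note=60, temperature=0.8))
```

In a self-contained physical instrument of the kind the abstract describes, a knob or sensor on the embedded platform would update a parameter such as `temperature` while generation runs incrementally, so the performer hears the effect of the control in real time.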

Original language: English
Title of host publication: Proceedings of the International Conference on New Interfaces for Musical Expression
Pages: 79-82
Number of pages: 4
Publication status: Published - 2019
Externally published: Yes
Event: 19th International Conference on New Interfaces for Musical Expression, NIME 2019 - Porto Alegre, Brazil
Duration: 3 Jun 2019 - 6 Jun 2019

Publication series

Name: Proceedings of the International Conference on New Interfaces for Musical Expression
Publisher: International Conference on New Interfaces for Musical Expression
ISSN (Print): 2220-4792

Conference

Conference: 19th International Conference on New Interfaces for Musical Expression, NIME 2019
Country/Territory: Brazil
City: Porto Alegre
Period: 3/06/19 - 6/06/19
