
An Interactive Musical Prediction System with Mixture Density Recurrent Neural Networks

2019-04-10
Charles P Martin, Jim Torresen

Abstract

This paper is about creating digital musical instruments where a predictive neural network model is integrated into the interactive system. Rather than predicting symbolic music (e.g., MIDI notes), we suggest that predicting future control data from the user and precise temporal information can lead to new and interesting interactive possibilities. We propose that a mixture density recurrent neural network (MDRNN) is an appropriate model for this task. The predictions can be used to fill in control data when the user stops performing, or as a kind of filter on the user's own input. We present an interactive MDRNN prediction server that allows rapid prototyping of new NIMEs featuring predictive musical interaction by recording datasets, training MDRNN models, and experimenting with interaction modes. We illustrate our system with several example NIMEs applying this idea. Our evaluation shows that real-time predictive interaction is viable even on single-board computers and that small models are appropriate for small datasets.
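The core idea of a mixture density output is that the network predicts the parameters of a Gaussian mixture rather than a single value, so each future control value is drawn by sampling from that mixture. The sketch below illustrates just the sampling step for one predicted control dimension; the mixture parameters (`pi`, `mu`, `sigma`) are hypothetical placeholders standing in for what an MDRNN would emit at each time step, not values from the paper's actual system.

```python
import random

def sample_mdn(pi, mu, sigma, rng=random):
    """Sample one value from a 1-D Gaussian mixture.

    pi    -- mixture weights (sum to 1)
    mu    -- component means
    sigma -- component standard deviations
    In an MDRNN, these parameters would be the network's output
    at each step; sampling yields the next predicted control value.
    """
    # Pick a mixture component according to the weights,
    # then sample from that component's Gaussian.
    k = rng.choices(range(len(pi)), weights=pi)[0]
    return rng.gauss(mu[k], sigma[k])

# Hypothetical mixture parameters for one control dimension in [0, 1]
pi = [0.7, 0.3]
mu = [0.2, 0.8]
sigma = [0.05, 0.05]
value = sample_mdn(pi, mu, sigma)
```

In practice a system like the one described would sample one such mixture per control dimension plus one for the time delta to the next event, which is how the "precise temporal information" enters the prediction.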

URL

http://arxiv.org/abs/1904.05009

PDF

http://arxiv.org/pdf/1904.05009

