City Research Online

Generating Time: Rhythmic Perception, Prediction and Production with Recurrent Neural Networks

Elmsley (né Lambert), A., Weyde, T. & Armstrong, N. (2017). Generating Time: Rhythmic Perception, Prediction and Production with Recurrent Neural Networks. Journal of Creative Music Systems, 1(2), doi: 10.5920/jcms.2017.04

Abstract

In the quest for a convincing musical agent that performs in real time alongside human performers, the issues surrounding expressively timed rhythm must be addressed. Current beat tracking methods are not sufficient to follow rhythms automatically when dealing with varying tempo and expressive timing. In the generation of rhythm, some existing interactive systems ignore the pulse entirely, or fix a tempo after some time spent listening to input. Since music unfolds in time, we take the view that musical timing needs to be at the core of a music generation system.

Our research explores a connectionist machine learning approach to expressive rhythm generation, based on cognitive and neurological models. Two neural network models are combined within one integrated system. A Gradient Frequency Neural Network (GFNN) models the perception of periodicities by resonating nonlinearly with the musical input, creating a hierarchy of strong and weak oscillations that relate to the metrical structure. A Long Short-Term Memory recurrent neural network (LSTM) models longer-term temporal relations based on the GFNN output.

The output of the system is a prediction of when in time the next rhythmic event is likely to occur. These predictions can be used to produce new rhythms, forming a generative model.
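To make the GFNN-to-LSTM pipeline concrete, the sketch below is a minimal illustration, not the authors' implementation: a bank of Hopf-style nonlinear oscillators stands in for the GFNN, and a small LSTM maps the oscillator amplitudes to the probability of a rhythmic event at the next time step. All names, parameter values, and the 0.5-8 Hz frequency range are illustrative assumptions.

```python
# Hedged sketch of a GFNN-like oscillator bank feeding an LSTM event predictor.
import numpy as np
import torch
import torch.nn as nn

def gfnn_layer(onsets, freqs, fs=100.0, alpha=-0.1, beta=-1.0, coupling=0.5):
    """Drive a bank of nonlinear oscillators (one per frequency) with an onset signal.

    onsets : 1-D array, rhythmic input sampled at fs Hz (1.0 at note onsets).
    freqs  : oscillator frequencies in Hz, e.g. log-spaced across the metrical hierarchy.
    Returns oscillator amplitude envelopes, shape (len(onsets), len(freqs)).
    """
    dt = 1.0 / fs
    z = np.full(len(freqs), 0.01 + 0j)            # small initial complex state
    out = np.zeros((len(onsets), len(freqs)))
    for t, x in enumerate(onsets):
        # Hopf-style update: intrinsic nonlinear dynamics plus external drive.
        dz = z * (alpha + 1j * 2 * np.pi * freqs + beta * np.abs(z) ** 2) + coupling * x
        z = z + dt * dz
        out[t] = np.abs(z)                         # resonance strength per frequency
    return out

class RhythmPredictor(nn.Module):
    """LSTM that reads oscillator amplitudes and predicts the next rhythmic event."""
    def __init__(self, n_oscillators, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_oscillators, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)           # P(event at next step)

    def forward(self, gfnn_features):
        h, _ = self.lstm(gfnn_features)            # (batch, time, hidden)
        return torch.sigmoid(self.head(h))         # (batch, time, 1)

# Toy usage: an isochronous onset pattern at 2 Hz, resonated by the oscillator
# bank, then passed through an (untrained) LSTM predictor.
fs, seconds = 100.0, 8
onsets = np.zeros(int(fs * seconds))
onsets[::50] = 1.0                                  # an onset every 0.5 s
freqs = np.logspace(np.log10(0.5), np.log10(8.0), 32)
features = gfnn_layer(onsets, freqs, fs=fs)

model = RhythmPredictor(n_oscillators=len(freqs))
probs = model(torch.tensor(features, dtype=torch.float32).unsqueeze(0))
print(probs.shape)                                  # torch.Size([1, 800, 1])
```

In such a setup, sampling from the predicted event probabilities (rather than comparing them to observed onsets) would turn the predictor into a simple generative model of rhythm, in the spirit of the production stage described above.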

We have trained the system on a dataset of expressively performed piano solos and evaluated its ability to accurately predict rhythmic events. Based on the encouraging results, we conclude that the GFNN-LSTM model has great potential to give real-time interactive systems the ability to follow and generate expressive rhythmic structures.

Publication Type: Article
Publisher Keywords: Music perception, rhythm generation, machine learning, neural networks, expressive timing
Subjects: M Music and Books on Music > M Music
Q Science > QA Mathematics > QA75 Electronic computers. Computer science
Departments: School of Communication & Creativity > Performing Arts > Music
School of Science & Technology > Computer Science
Full text: Published Version (PDF, 21MB), available under a Creative Commons Attribution license.
