Characterizing and Interpreting Music Expressivity through Rhythm and Loudness Simplices
Abstract
Characterizing and interpreting expressivity in performed music remains an open problem. In this paper, we explore a novel representation of recorded performances of triple time music using a 2-simplex, a graphical representation used to visualize three-interval rhythms. We analyze the MazurkaBL dataset, which contains beat-level tempo and loudness data for over 2000 recorded performances of 46 Chopin Mazurkas. The Mazurkas' triple time lends itself well to the 2-simplex: the expressive features of each three-beat bar map directly to unique points in the 2-simplex. We extend the rhythm simplex, designed for beat durations, to the representation of loudness. Each recorded performance is thus reduced to a set of points in 2-simplices based on beat-level duration or loudness. We provide the transformation to convert three-interval information to points in the 2-simplex; prove that inter-beat interval and tempo representations in the 2-simplex are equivalent when timing variations are small; and explain how smoothing the data impacts the coordinates of the points in the simplex. We demonstrate that the use of simplices can facilitate the analysis and interpretation of expressive music features; the method enables the identification of bars with notable expressive variations, such as temporal suspensions that form tipping points, and the characterization of performance regularity.
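The transformation mentioned above can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: it assumes the standard barycentric mapping, in which the three per-bar values (beat durations or loudness values) are normalized to sum to one and then used as weights on the vertices of an equilateral triangle. The function name `simplex_point` and the choice of vertex coordinates are hypothetical.

```python
import math

def simplex_point(b1, b2, b3):
    """Map three per-bar values (e.g. inter-beat intervals or loudness
    values) to Cartesian coordinates inside an equilateral 2-simplex.

    The values are normalized to barycentric weights summing to 1, then
    combined with the triangle's vertex coordinates."""
    total = b1 + b2 + b3
    a, b, c = b1 / total, b2 / total, b3 / total
    # Vertices of an equilateral triangle with unit side length
    v1, v2, v3 = (0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)
    x = a * v1[0] + b * v2[0] + c * v3[0]
    y = a * v1[1] + b * v2[1] + c * v3[1]
    return x, y

# A perfectly even bar (equal beats) maps to the simplex centroid
print(simplex_point(1.0, 1.0, 1.0))  # → (0.5, sqrt(3)/6 ≈ 0.2887)
```

Under this mapping, a metrically even bar sits at the centroid, and expressive deviations (e.g. a lengthened third beat in a Mazurka) displace the point toward the corresponding vertex.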