
Really thrilled to have my first journal paper linking music rhythms to performed timings and arrhythmia published online (journals.sagepub.com/doi/full/10.1177/2059204318795159) in the newly established journal Music & Science, edited by Professor Ian Cross. The article documents many of the ideas underlying the ERC ADG COSMOS: Computational Shaping and Modeling of Musical Structures.
I am grateful to Ian Cross for kindly shepherding the paper through the review process, and to the reviewers—Professor Ian Pace of City, University of London, Professor Jonathan Berger of Stanford University, and an anonymous reviewer—for thoughtful comments that made the paper far better than it was at the start.
Chew, E. (2018). Notating disfluencies and temporal deviations in music and arrhythmia. Music & Science, 1, 1–22. First published online September 24, 2018.
Abstract: Expressive music performance and cardiac arrhythmia can be viewed as deformations of, or deviations from, an underlying pulse stream. I propose that the results of these pulse displacements can be treated as actual rhythms and represented accurately via a literal application of common music notation, which encodes proportional relations among duration categories, and figural and metric groupings. I apply the theory to recorded music containing extreme timing deviations and to electrocardiographic (ECG) recordings of cardiac arrhythmias. The rhythm transcriptions are based on rigorous computer-assisted quantitative measurements of onset timings and durations. The root-mean-square error ranges for the rhythm transcriptions were (19.1, 87.4) ms for the music samples and (24.8, 53.0) ms for the arrhythmia examples. For the performed music, the representation makes concrete the gap between the score and performance. For the arrhythmia ECGs, the transcriptions show rhythmic patterns evolving through time, progressions which are obscured by predominant individual beat morphology- and frequency-based representations. To make tangible the similarities between cardiac and music rhythms, I match the heart rhythms to music with similar rhythms to form assemblage pieces. The use of music notation leads to representations that enable formal comparisons and automated as well as human-readable analysis of the time structures of performed music and of arrhythmia ECG sequences beyond what is currently possible.
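The abstract reports root-mean-square errors between measured onset timings and the onsets implied by the rhythm transcriptions. As a minimal illustrative sketch (not the paper's actual implementation), the quantity can be computed like this, with hypothetical onset values in milliseconds:

```python
import math

def rms_error_ms(measured_onsets, transcribed_onsets):
    """Root-mean-square error (ms) between measured onset times and the
    idealized onset times implied by a candidate rhythm transcription.
    Both lists are in milliseconds, aligned beat-for-beat."""
    assert len(measured_onsets) == len(transcribed_onsets)
    squared = [(m - t) ** 2 for m, t in zip(measured_onsets, transcribed_onsets)]
    return math.sqrt(sum(squared) / len(squared))

# Hypothetical example: four measured beat onsets vs. the onsets a
# notated rhythm would predict at the fitted tempo.
measured = [0.0, 512.0, 1028.0, 1490.0]
notated = [0.0, 500.0, 1000.0, 1500.0]
print(round(rms_error_ms(measured, notated), 1))  # → 16.0
```

A transcription is chosen so that this error stays small; the ranges quoted in the abstract (19.1–87.4 ms for music, 24.8–53.0 ms for arrhythmia) bound the fit achieved across the examples.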
[ journal | html | pdf ]
Figure 15. ECG and transcription of atrial fibrillation excerpt: Thu 17-38-26, VT 4 beats 200 beats/min (summary of event), 1 min HR 105 beats/min.