Integrated Multimodal Score-following Environment (IMuSE):
The Integrated Multimodal Score-following Environment (IMuSE) is a research project funded by
the Social Sciences and Humanities Research Council (SSHRC) starting in 2009.
IMuSE is a software environment based on the NoteAbilityPro music editor and the IIMPE system developed by
Keith Hamel and Bob Pritchard. With IMuSE, score-following is expanded from audio pitch-tracking to include a variety of
multi-channel gesture data. Live performance gestures can come from any kind of single- or multi-channel continuous data:
standard controllers, accelerometers, movement trackers, video trackers, live audio analysis, etc. This data is matched
against pre-recorded gesture data stored in the NoteAbilityPro score as multi-channel break-point functions.
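As an illustration of how such recorded gesture data might be represented, here is a minimal sketch of a multi-channel break-point function with linear interpolation between points. The class and variable names are hypothetical and do not reflect the actual NoteAbilityPro or IMuSE internals.

    # Illustrative sketch only; not the NoteAbilityPro/IMuSE API.
    from bisect import bisect_right

    class BreakpointFunction:
        """One channel of recorded gesture data: (time, value) pairs,
        linearly interpolated between breakpoints."""
        def __init__(self, points):
            self.points = sorted(points)        # [(time_sec, value), ...]

        def value_at(self, t):
            times = [p[0] for p in self.points]
            i = bisect_right(times, t)
            if i == 0:
                return self.points[0][1]        # clamp before first point
            if i == len(self.points):
                return self.points[-1][1]       # clamp after last point
            (t0, v0), (t1, v1) = self.points[i - 1], self.points[i]
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

    # A gesture from a 3-axis accelerometer is three parallel channels.
    gesture = {
        "x": BreakpointFunction([(0.0, 0.1), (0.5, 0.9), (1.0, 0.2)]),
        "y": BreakpointFunction([(0.0, 0.0), (0.7, 0.6), (1.0, 0.1)]),
        "z": BreakpointFunction([(0.0, 0.5), (0.4, 0.3), (1.0, 0.8)]),
    }
    expected = {axis: bpf.value_at(0.6) for axis, bpf in gesture.items()}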
Polyphonic pitch-tracking is performed using the Antescofo score-following software developed by
Arshia Cont, and gesture-tracking is performed using the gf software developed by Frédéric Bevilacqua
and Bruno Zamborlin. IMuSE negotiates between tracking data received from multiple sources
and adjusts score playback based on an assessment of which data is most reliable at each point in the performance.
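This negotiation can be pictured as a confidence-weighted combination of the score positions each source reports. The sketch below is only an illustration of that idea, not the published IMuSE algorithm; the function name, threshold, and weighting scheme are invented for the example.

    # Illustrative sketch only; not the actual IMuSE arbitration logic.
    def fuse_positions(estimates, min_confidence=0.2):
        """estimates: list of (score_position_in_beats, confidence in [0, 1]).
        Returns a confidence-weighted position, ignoring unreliable trackers."""
        usable = [(p, c) for p, c in estimates if c >= min_confidence]
        if not usable:
            return None                 # fall back to the internal tempo clock
        total = sum(c for _, c in usable)
        return sum(p * c for p, c in usable) / total

    # e.g. pitch tracking is confident here, the gesture follower less so:
    position = fuse_positions([(12.4, 0.9),   # Antescofo estimate
                               (12.9, 0.3)])  # gf gesture-follower estimate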

An example of a NoteAbilityPro score containing the music to be played by a live performer, MaxMSP control messages, and gestures recorded from a 3-axis accelerometer.
Video Demonstrations:
Touch for Piano and Interactive Electronics [ Masaki - piano ]
In the performance below, the motion of the pianist's hands is used both to synchronize the performance to the score and to generate electroacoustic events. The pianist (Megumi Masaki)
plays from the score (which includes information about how the hands are moved above the keyboard) while a NoteAbilityPro score is running on a separate computer. The piano
sounds are processed and a variety of other samples are generated during the performance. You can see the camera mounted directly above the piano keyboard.
Touch for Piano and Interactive Electronics [ Hamm - piano ]
Another performance of Touch, this one by Corey Hamm. The hand gestures differ from those in the
performance above, showing how the piece can be personalized to the gestural movements of different performers.
Gesture following using IMuSE
In the video demonstrations below, the motion of the pianist's hands is used to synchronize the performance to the score. To do this,
a camera was mounted above the piano keys, the movement of the hands was analysed during a rehearsal, and a graph of the hand movement was aligned to the music in the score.
During live performances, the motion of the hands is again tracked by the camera, and these movements are compared against the recorded graphs of hand motion.
This information is used to synchronize the live performance to the score. No pitch tracking is used in this demonstration; only the movements of the hands are used to follow the score.
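The matching step just described can be pictured as a comparison between a recent window of live hand-motion values and the curve recorded in rehearsal. IMuSE itself performs this with the HMM-based gf gesture follower; the brute-force sliding-window sketch below, whose function name, frame rate, and distance measure are all illustrative assumptions, only demonstrates the idea.

    # Illustrative sketch only; IMuSE uses the gf gesture follower instead.
    def estimate_score_time(recorded, live_window, frame_rate=30.0):
        """recorded: hand-position values sampled at frame_rate in rehearsal.
        live_window: the most recent live samples (same rate, same feature).
        Returns the score time (seconds) where the window matches best."""
        n = len(live_window)
        best_offset, best_cost = 0, float("inf")
        for offset in range(len(recorded) - n + 1):
            cost = sum((recorded[offset + k] - live_window[k]) ** 2
                       for k in range(n))
            if cost < best_cost:
                best_offset, best_cost = offset, cost
        # the window ends "now", so score time is at the end of the match
        return (best_offset + n) / frame_rate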
A detailed explanation of how gesture following is implemented by IMuSE:
A live demonstration of gesture following:
IMuSE System Description
Example 1 - Hindemith: Der Schwanendreher
Example 2 - Partos: Yizkor (In Memoriam)
Example 3 - Bach: Prelude from Cello Suite IV
Example 4 - Gubaidulina: Quasi Hoquetus
Example 5 - Gubaidulina: Quasi Hoquetus using Computer Vision
Complete video with all examples
Papers and Documentation:
Ritter, M., Hamel, K., and Pritchard, R. "Integrated Multimodal Score-Following
Environment" in
Proceedings of the International Computer Music Conference, 2013. Perth, Australia. pp. 185-192.
Hamel, K. "Network Port Panel in NoteAbilityPro" in
NoteAbilityPro Reference Manual, Version 2.350, 2008.
Hamel, K. "Using Breakpoint Functions in NoteAbilityPro" in
NoteAbilityPro Reference Manual, Version 2.350, 2008.
Hamel, K. "Antescofo Support in NoteAbilityPro" in
NoteAbilityPro Reference Manual, Version 2.350, 2008.