Abstract
CalmusWaves is the first live performance project of its kind in which choreography, music and lighting are created together in real time. During CalmusWaves performances, dancers, connected to the composition program CALMUS through wireless OSC sensors, interact with musicians who sight-read graphical notation appearing simultaneously on iPads via the CALMUS Notation app. Further input is given by light technicians and computer programmers. This process allows the participants to control and modify the development of a real-time composition through their onstage performances [1]. The principal aim of this research project was to investigate the mapping of the dancers' gestures and movements to control the real-time musical composition. Other research topics included how the data deriving from the wireless sensors could be converted, filtered and truncated into data allowing CALMUS to understand and react to the stimuli in a musical way. The purpose of this paper is to describe the technical and artistic processes involved in our project.
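To make the sensor-to-composition pipeline described above more concrete, the sketch below shows one way incoming OSC sensor data could be received, filtered and truncated into a bounded control value. This is a minimal illustration only: the OSC address `/dancer/1/accel`, the port number, the smoothing filter and the 0-127 intensity scale are assumptions, and the paper does not specify how CALMUS itself implements this mapping.

```python
# Minimal sketch (assumed pipeline): receive OSC accelerometer data from a
# dancer's wireless sensor, smooth it, and truncate it into a bounded
# control value that a composition engine could interpret musically.
# The address, port and mapping are hypothetical, not taken from the paper.
from pythonosc import dispatcher, osc_server

ALPHA = 0.2          # smoothing factor for the exponential moving average
smoothed = 0.0       # running filtered magnitude of the gesture signal


def handle_accel(address, x, y, z):
    """Filter raw sensor values and truncate them to a usable range."""
    global smoothed
    magnitude = (x * x + y * y + z * z) ** 0.5
    smoothed = ALPHA * magnitude + (1 - ALPHA) * smoothed

    # Truncate and scale the filtered value to a 0-127 "intensity" that
    # could drive, for example, dynamics or textural density.
    intensity = max(0, min(127, int(smoothed * 127)))
    print(f"{address}: intensity={intensity}")


if __name__ == "__main__":
    disp = dispatcher.Dispatcher()
    disp.map("/dancer/1/accel", handle_accel)   # hypothetical OSC address
    server = osc_server.BlockingOSCUDPServer(("0.0.0.0", 9000), disp)
    server.serve_forever()
```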
| | |
|---|---|
| Original language | English |
| Pages (from-to) | 448-451 |
| Number of pages | 4 |
| Journal | Writing in Practice |
| Volume | 3 |
| Publication status | Published - 2017 |