Instant Composition

Performance art
Contemporary dance
Biometric data
Spatial data
Machine learning
Live coding
Work in progress
Instant Composition is the art of composing in the moment. In dance, an instant composition is an improvised piece created by one or more dancers while performing on stage. Composing in the moment demands strong presence and awareness from the dancer, who continuously reads and responds to the other dancers.
As a dancer, I love the level of expression that’s possible through improvisation and instant composition. And, as an artist and creative technologist, I’m very much interested in augmenting this experience by opening up the composition to musicians, visual artists, and computer algorithms. This installation is under development: I learn, build, and improve by doing artistic research, and I’m open to collaboration with other artists.
For this work, I’m creating an open space where real-time data of the performance can be used instantly by one or more performers and computer algorithms to drive interactive and/or live coded music compositions and visuals.
Performance data is collected wirelessly via sensors that measure the conditions and occupancy of the space, and via unobtrusive wearable sensors on the dancers’ bodies that measure properties such as orientation and heartbeat. This spatial and biometric data can be used directly, or it can be combined with other data to infer properties such as the dancer’s emotions, Laban features, and the state of the space.
Distributed computing forms the basis of the installation. All sensor data is collected and preprocessed on independent devices and broadcast wirelessly over a dedicated WiFi network. Performers and computer algorithms only need access to this network to read the sensor data and participate in the composition.
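To make this concrete, here is a minimal sketch of how a sensor device could broadcast a reading to the whole network. It hand-encodes an OSC 1.0 message using only the Python standard library; the address pattern `/dancer/1/heartbeat` and port 9000 are assumptions for illustration, not the installation's actual configuration (an OSC library such as python-osc would work just as well).

```python
import socket
import struct

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message with float32 arguments (OSC 1.0 binary format)."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated, then padded to a multiple of 4 bytes.
        return b + b"\x00" * (4 - len(b) % 4)

    type_tags = "," + "f" * len(args)  # the type tag string starts with ','
    message = pad(address.encode()) + pad(type_tags.encode())
    for value in args:
        message += struct.pack(">f", value)  # big-endian float32
    return message

def broadcast(address: str, *args: float, port: int = 9000) -> None:
    """Send one OSC message to every listener on the local network."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(osc_message(address, *args), ("255.255.255.255", port))
    sock.close()

# Example: broadcast("/dancer/1/heartbeat", 72.0)
```

Because the message goes to the broadcast address, every device on the WiFi network receives it without any point-to-point configuration.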
All sensor data is broadcast as OSC messages over UDP to the entire network. Any device or computer on the network can listen for these OSC messages and process them in its own way: by adding music or visuals to the performance, for instance, or by combining and deriving new data and broadcasting it back over the network.
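The listening side can be sketched in the same stdlib-only style: bind a UDP socket, decode each incoming OSC packet, and hand the values to whatever is generating sound or visuals. The decoder below handles float32 (`f`) and int32 (`i`) arguments only, and port 9000 is again an assumption.

```python
import socket
import struct

def parse_osc(data: bytes):
    """Decode an OSC message with float32 ('f') and int32 ('i') arguments."""
    def read_string(buf: bytes, offset: int):
        end = buf.index(b"\x00", offset)
        # Skip the terminator plus padding up to the next 4-byte boundary.
        return buf[offset:end].decode(), (end + 4) & ~3

    address, offset = read_string(data, 0)
    type_tags, offset = read_string(data, offset)
    args = []
    for tag in type_tags[1:]:  # skip the leading ','
        if tag == "f":
            args.append(struct.unpack_from(">f", data, offset)[0])
        elif tag == "i":
            args.append(struct.unpack_from(">i", data, offset)[0])
        offset += 4
    return address, args

def listen(port: int = 9000) -> None:
    """Print every OSC message arriving on the network (runs until stopped)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    while True:
        packet, _ = sock.recvfrom(4096)
        print(parse_osc(packet))
```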
Performers can use any software that supports OSC and/or MIDI. Some examples are Processing, TouchDesigner, Ableton Live and Max for Live, Pure Data, Modul8, and SuperCollider. Live coding on stage can be introduced with software such as TidalCycles and Sonic Pi. The computer algorithms that derive new data can be programmed by hand or trained using machine learning.
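As a minimal illustration of a hand-programmed (rather than machine-learned) derivation, a participant could smooth a noisy heartbeat stream with an exponential moving average and rebroadcast the result under a new OSC address. The class below is a sketch of that one step; the smoothing factor is an arbitrary illustrative choice.

```python
class EMA:
    """Exponential moving average: derives a smoothed stream (e.g. an
    average heart rate) from noisy sensor readings, suitable for
    rebroadcasting on the network as new data."""

    def __init__(self, alpha: float = 0.2):
        self.alpha = alpha   # smoothing factor, 0 < alpha <= 1
        self.value = None

    def update(self, sample: float) -> float:
        # The first sample initializes the average; each later sample
        # moves it a fraction `alpha` toward the new reading.
        if self.value is None:
            self.value = sample
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value
```

A small `alpha` yields a slow, stable signal suited to driving long musical gestures; a large `alpha` tracks the dancer's raw pulse more closely.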