CoSC: Shaker

This mobile web application has been developed for the Collective Sound Checks that took place in the Studio 13/16 at the Centre Pompidou.

In this scenario, players can record arbitrary percussive sounds (with their voice or using props) via a microphone (and a foot pedal). Once the recording is finished, the players can load the recorded sound onto their mobile devices and perform it by shaking them. All the devices are beat-synchronized to a steady pulse (sixteenth notes at 100 BPM) so that multiple players can easily perform together.
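The shared pulse can be sketched with a little arithmetic: at 100 BPM a sixteenth-note grid has a period of (60 / 100) / 4 = 0.15 s, and each device schedules its next sound on the next grid point of a clock shared by all devices. The sketch below is an illustration of that idea, assuming such a shared clock (e.g. one synchronized over the Web Socket connection); it is not the published COSIMA code.

```javascript
// Sixteenth-note grid at 100 BPM: one beat every 60/100 s,
// so one sixteenth note every (60 / 100) / 4 = 0.15 s.
const BPM = 100;
const GRID = 60 / BPM / 4; // seconds between sixteenth notes

// Given the current time on a clock shared by all devices
// (a hypothetical synchronized clock), return the time of the
// next grid point, so that every device triggers its sound
// at the same instants.
function nextBeatTime(sharedNow) {
  return Math.ceil(sharedNow / GRID) * GRID;
}

console.log(nextBeatTime(0.1)); // → 0.15
```

On each device, a scheduling loop would wake up shortly before `nextBeatTime` and queue the next sound with the Web Audio API's sample-accurate scheduling.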

Each sound recording is analysed on the server and segmented into percussive elements that are classified by their intensity. On the mobile device, the concatenative synthesizer generates a sound on each beat. Each sound is selected according to the device motion intensity: the synth plays soft sound segments when the player shakes the device softly, and louder segments when the player shakes the device more vigorously.
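The intensity mapping described above can be sketched as a simple lookup: with the segments sorted from softest to loudest (as produced by the server-side analysis), the device's acceleration magnitude, normalized to [0, 1], indexes into that list. The function and its parameters below are illustrative assumptions, not the actual implementation.

```javascript
// Hypothetical sketch of selecting a sound segment by shake intensity.
// `segments` is assumed to be sorted from softest to loudest.
function selectSegment(segments, acceleration, maxAcceleration = 20) {
  // Normalize the motion intensity to [0, 1].
  const intensity = Math.min(Math.abs(acceleration) / maxAcceleration, 1);
  // Map intensity onto the segment list: soft shakes pick soft segments.
  const index = Math.min(
    Math.floor(intensity * segments.length),
    segments.length - 1
  );
  return segments[index];
}

const segments = ['pp', 'mp', 'mf', 'ff'];
console.log(selectSegment(segments, 3)); // soft shake → 'pp'
console.log(selectSegment(segments, 25)); // vigorous shake → 'ff'
```

In the real application, the acceleration would come from the browser's devicemotion event, and the selected segment would be handed to the concatenative synthesizer on the next beat.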

The players are encouraged to record phrases with percussive elements of a wide dynamic range. They can experiment with different sound recordings and create an ensemble by recording complementary materials.

CoSC: Matrix

This mobile web application has been developed in the context of Collective Sound Checks with the Studio 13/16 at the Centre Pompidou.

The players are sitting on a grid (for instance, 3 rows by 4 columns, for a total of 12 people). Their mobile devices form a matrix of screens and loudspeakers that are used to spatialize sound and light.

For now, the Matrix is performed by one player at a time. A representation of the matrix appears on the screen of the player who becomes the performer: by moving a finger across the on-screen matrix, the performer controls which smartphone(s) emit light and sound in the real world. (The sound changes with the speed of the finger trajectory.) In this way, the performer remotely plays the other players' instruments. After a fixed time, another player takes over control of the sound and light and becomes the new performer.
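Mapping the performer's touch to a device in the grid amounts to quantizing the finger position onto the matrix cells. The sketch below is one plausible way to do it, assuming touch coordinates normalized to [0, 1]; the function name and grid size are illustrative.

```javascript
// Hypothetical mapping from a normalized finger position (x, y in [0, 1])
// on the on-screen matrix to the grid cell of the device that should
// light up and sound (e.g. 3 rows by 4 columns, as in the example above).
function touchToCell(x, y, rows = 3, cols = 4) {
  const col = Math.min(Math.floor(x * cols), cols - 1);
  const row = Math.min(Math.floor(y * rows), rows - 1);
  return { row, col, index: row * cols + col };
}

console.log(touchToCell(0.9, 0.1)); // top-right area → { row: 0, col: 3, index: 3 }
```

The server would broadcast the resulting cell index (together with the finger speed) to all devices, and each device compares it with its own position in the grid.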

The video below gives an idea of the technical setup. While the players are usually seated at a distance of 1 or 2 meters from each other, the smartphones are spaced by a few centimeters only for the purpose of this video.

The sound is generated locally on the mobile devices, which are connected to a Web Socket server (using Node.js and socket.io). The server receives the position from the performer's device and controls the sound generators of all devices in the matrix.
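The relay logic on the server reduces to a few lines. The sketch below is a plain-JavaScript illustration of the idea; the actual application uses socket.io, whose connection handling would replace the hypothetical `connect`/`broadcast` functions here.

```javascript
// Minimal sketch of the server-side relay: every connected matrix
// device registers a callback, and each position update from the
// performer's device is forwarded to all of them.
function createRelay() {
  const clients = new Set();
  return {
    // Register a device; returns a function that disconnects it.
    connect(onMessage) {
      clients.add(onMessage);
      return () => clients.delete(onMessage);
    },
    // Forward the performer's position to every connected device.
    broadcast(position) {
      for (const send of clients) send(position);
    },
  };
}
```

With socket.io, `connect` corresponds to the server's connection event and `broadcast` to emitting the position on all sockets.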

CoSC Web Applications

For the Collective Sound Checks with the Studio 13/16 at the Centre Pompidou we have developed a series of web applications for mobile devices. The applications are based on web technologies such as HTML 5, JavaScript, Web Audio API, and Web Sockets.

The first set of applications we developed is a collection of small gadgets that produce sound depending on the device's motion. The gadgets can be played individually or in a group and allow for exploring different techniques, sound materials, and metaphors. The drone, the birds, the monks, and the rainstick are described below.

In addition to these gadgets, we have experimented with collaborative scenarios, which are described in separate posts:

The gadgets and the We Will Rock You: Reloaded application have been published at http://cosima.ircam.fr/checks (the applications run on mobile devices and require at least iOS 6 or Android 4.2).


Drone

The drone reacts to the device's rotation, modulating the amplitude and frequency of a set of oscillators that generate a bass drone. Shaking the device strongly produces an electric sound synthesized through granular synthesis.
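A rotation-to-modulation mapping of this kind could be sketched as follows. The base frequency, tilt range, and output ranges are assumptions made for the illustration, not the published parameters.

```javascript
// Illustrative mapping from a device tilt angle (degrees, as delivered
// by the deviceorientation event) to modulation parameters for a bass
// drone. Base frequency and ranges are assumptions for this sketch.
function droneParams(tiltDegrees, baseFreq = 55) {
  // Clamp and normalize the tilt (-90..90 degrees) to [-1, 1].
  const t = Math.max(-1, Math.min(1, tiltDegrees / 90));
  return {
    frequency: baseFreq * Math.pow(2, t), // sweep up/down one octave
    gain: 0.5 + 0.5 * Math.abs(t),        // louder when tilted further
  };
}

console.log(droneParams(0)); // → { frequency: 55, gain: 0.5 }
```

In a Web Audio implementation, these values would be written to an OscillatorNode's frequency parameter and a GainNode's gain on each deviceorientation event.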


Birds

Birds is a collection of bird sounds that are played by jiggling the device. Each player can try different bird calls. Two or more players can communicate through tweeting and create a forest-like atmosphere of distributed bird sounds answering each other.


Monks

Monks features a short extract of the song “Early Morning Melody” from Meredith Monk’s Book of Days and the recording of a Tibetan chant. Both extracts are performed through granular synthesis by tilting the device sideways. A group of players can form a choir.


Rainstick

The rainstick is based on sound materials created by Pierre Jodlowski, which were also used for the audiovisual installation Grainstick, produced at Ircam in 2010. The player has to hold the device horizontally and tilt it up and down like a rainstick to produce sound.