Rone : Créatures & Cie @ Palais de Tokyo

Rone was invited to unveil the universe he created for his latest album Créatures at the Palais de Tokyo (Le Point Perché). Along with illustrations, photographs, binaural experiences and video games, CoSiMa presented Créatures & Cie – Collective Sound Check, a spontaneous collective performance that enables the audience to play with Rone’s creatures. By simply opening a web page on their smartphones, visitors to the exhibition can discover a novel way of exploring Rone’s musical universe and fill the space with his sound creatures.

This performance is a first step towards a new generation of interactive musical experiences that Rone is developing in collaboration with IRCAM.

Rone : Créatures & Cie


We Will Rock You: Reloaded

This mobile web application has been developed in the context of Collective Sound Checks with the Studio 13/16 at the Centre Pompidou.

This application allows a group of players to perform Queen’s song “We Will Rock You” with a set of simple instruments and to create their own versions of the song. The players can choose between drums, voice solo, choirs, Freddie Mercury’s voice fill-ins (‘sing it’), a guitar power chord, and the final guitar riff.

While most of the instruments trigger segments of the original recordings when the player strikes the air with the device, the power chord and the guitar riff resynthesize guitar sounds through granular synthesis.
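A minimal sketch of how such a strike gesture might be detected from accelerometer data (the threshold value and function names are assumptions, not the actual implementation): a strike is registered only when the acceleration magnitude crosses a threshold on a rising edge, so a sustained shake does not retrigger on every sample.

```javascript
// Assumed threshold in m/s^2 — would need tuning per device.
const STRIKE_THRESHOLD = 20;

// Returns a stateful detector fed with raw accelerometer samples.
function makeStrikeDetector(threshold = STRIKE_THRESHOLD) {
  let above = false;
  return function onMotion(ax, ay, az) {
    const magnitude = Math.sqrt(ax * ax + ay * ay + az * az);
    const wasAbove = above;
    above = magnitude > threshold;
    return above && !wasAbove; // true only on the rising edge
  };
}
```

In a browser, the detector would typically be fed from a `devicemotion` listener, e.g. `window.addEventListener('devicemotion', e => { const a = e.accelerationIncludingGravity; if (detect(a.x, a.y, a.z)) triggerSegment(); })`, where `triggerSegment` stands in for whatever plays the audio segment.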

The application has been published here (it requires a mobile device running iOS 6 or later, or Android 4.2 or later with Chrome 35 or later).

CoSC: Shaker

This mobile web application has been developed for the Collective Sound Checks that took place in the Studio 13/16 at the Centre Pompidou.

In this scenario, players can record arbitrary percussive sounds (with their voice or using props) using a microphone (and a foot pedal). Once the recording is finished, the players can load the recorded sound onto their mobile devices and perform it by shaking the devices. All the devices are beat-synchronized to a steady tempo (16th notes at 100 BPM) so that multiple players can easily perform together.
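The shared beat grid is simple to derive: at 100 BPM a quarter note lasts 600 ms, so 16th notes fall every 150 ms. A minimal sketch of the grid computation (function names and the shared-origin convention are assumptions):

```javascript
const BPM = 100;
const SIXTEENTH = 60 / BPM / 4; // 0.15 s between 16th-note beats

// Given the current time and an agreed grid origin shared by all devices
// (e.g. obtained from a clock-sync exchange with the server), return the
// time of the next 16th-note beat so every device triggers on the same grid.
function nextBeatTime(now, origin = 0, interval = SIXTEENTH) {
  const elapsed = now - origin;
  const beats = Math.floor(elapsed / interval) + 1;
  return origin + beats * interval;
}
```

A client would schedule its next sound at `nextBeatTime(audioContext.currentTime, origin)` rather than playing immediately, which keeps all devices on the common pulse.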

Each sound recording is analysed on the server and segmented into percussive elements that are classified by their intensity. On the mobile device, the concatenative synthesizer generates a sound on each beat. Each sound is selected according to the device motion intensity: the synth plays soft sound segments when the player shakes the device softly, and louder segments when the player shakes the device more vigorously.
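The intensity-driven selection step can be sketched as follows (a simplification with assumed names, not the actual synthesizer code): segments come pre-sorted by the intensity assigned during analysis, and the current shake intensity, normalized to 0..1, indexes into that list.

```javascript
// segments: array of sound segments sorted from softest to loudest.
// intensity: normalized device-motion intensity in 0..1.
function selectSegment(segments, intensity) {
  const clamped = Math.min(1, Math.max(0, intensity));
  const index = Math.min(segments.length - 1,
                         Math.floor(clamped * segments.length));
  return segments[index];
}
```

On each beat the synthesizer would call this with the latest motion estimate, so gentle shaking walks through the soft end of the recording and vigorous shaking reaches the loud segments.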

The players are encouraged to record phrases with percussive elements of a wide dynamic range. They can experiment with different sound recordings and create an ensemble by recording complementary materials.

CoSC: Matrix

This mobile web application has been developed in the context of Collective Sound Checks with the Studio 13/16 at the Centre Pompidou.

The players are sitting on a grid (for instance, 3 rows by 4 columns, for a total of 12 people). Their mobile devices form a matrix of screens and loudspeakers that are used to spatialize sound and light.

For now, the Matrix is performed by one player at a time: a representation of the matrix appears on the screen of one player, who becomes the performer. By moving a finger across the on-screen matrix, the performer controls which smartphone(s) emit light and sound in the real world. (The sound changes with the speed of the finger trajectory.) In this way, the performer remotely plays the other players’ instruments. After a fixed time, another player takes over control of the sound and light and becomes the new performer.
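The core mapping from the performer's touch to a device in the physical grid can be sketched as a simple row-major lookup (a hypothetical simplification; the coordinate and indexing conventions are assumptions):

```javascript
// Map a touch position inside the on-screen matrix view to a device index
// in the physical grid (row-major), e.g. rows = 3, cols = 4 for 12 players.
function touchToDevice(x, y, width, height, rows, cols) {
  const col = Math.min(cols - 1, Math.floor((x / width) * cols));
  const row = Math.min(rows - 1, Math.floor((y / height) * rows));
  return row * cols + col;
}
```

The finger speed, computed from successive touch positions, would be sent along with the device index to drive the sound parameters.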

The video below gives an idea of the technical setup. While the players are usually seated at a distance of 1 or 2 meters from each other, the smartphones are spaced by a few centimeters only for the purpose of this video.

The sound is generated locally on the mobile devices, which are connected to a Web Socket server (using Node.js). The server receives the position from the performer’s device and controls the sound generators of all devices of the matrix.
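The server-side routing logic can be sketched as a pure function (the message shape is an assumption; the actual server code is not shown in this post): when the performer's device reports a position, the server produces one control message per connected device, telling it whether it is currently selected.

```javascript
// position: { deviceIndex, speed } derived from the performer's touch.
// deviceIds: the connected devices of the matrix, in grid order.
// Returns one control message per device; the client uses `active` to
// switch its light/sound on or off and `speed` to shape the sound.
function routePosition(position, deviceIds) {
  return deviceIds.map((id, index) => ({
    to: id,
    active: index === position.deviceIndex,
    speed: position.speed,
  }));
}
```

In the real system each message would be serialized (e.g. as JSON) and pushed over the Web Socket connection to the corresponding device.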

CoSC Web Applications

For the Collective Sound Checks with the Studio 13/16 at the Centre Pompidou we have developed a series of web applications for mobile devices. The applications are based on web technologies such as HTML 5, JavaScript, Web Audio API, and Web Sockets.

The first set of applications we developed is a collection of gadgets that produce sound depending on the device’s motion. The gadgets can be played individually or by a group of players and allow for exploring different techniques, sound materials, and metaphors. The drone, the birds, the monks, and the rainstick are described below.

In addition to these gadgets, we have experimented with collaborative scenarios that are described in separate posts:

The gadgets and the We Will Rock You: Reloaded application have been published at (the applications work on mobile devices and require at least iOS 6 or Android 4.2).


The drone reacts to the device’s rotation, responding with amplitude and frequency modulation of a set of oscillators that generate a bass drone. Shaking the device strongly produces an electric sound synthesized through granular synthesis.
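A minimal sketch of such an orientation-to-drone mapping (the base frequency, ranges, and names are assumptions for illustration, not the gadget's actual parameters): front-back tilt modulates the oscillator frequency over an octave range, left-right tilt controls the amplitude.

```javascript
// beta: front-back tilt in degrees (-180..180), as in deviceorientation events.
// gamma: left-right tilt in degrees (-90..90).
function droneParams(beta, gamma) {
  const baseFreq = 55; // assumed bass fundamental (A1)
  const freq = baseFreq * Math.pow(2, beta / 180); // +/- one octave
  const gain = (gamma + 90) / 180;                 // 0..1 amplitude
  return { freq, gain };
}
```

In a browser the values would come from a `deviceorientation` listener and be written to the `frequency` and `gain` AudioParams of the oscillator/gain graph.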


Birds is a collection of bird sounds that are played by jiggling the device. Each player can try different bird calls. Two or more players can communicate through tweeting and create a forest-like atmosphere of distributed bird sounds answering to each other.


Monks features a short extract of the song “Early Morning Melody” from Meredith Monk’s Book of Days and the recording of a Tibetan chant. Both extracts are performed through granular synthesis by tilting the device sideways. A group of players can form a choir.


The rainstick is based on sound materials that were created by Pierre Jodlowski and also used for the audiovisual installation Grainstick, produced at IRCAM in 2010. The player has to hold the device horizontally and tilt it up and down like a rainstick to produce sound.