For the opening of the 36th edition of the Montpellier Danse festival, EnsadLab, idscènes, and the festival invited the public to experience a novel collective and immersive interactive installation on Thursday, June 23 from 1 pm to 5 pm, and on Friday, June 24 and Saturday, June 25 from 11 am to 5 pm, in the Salle Béjart of the Agora (18 Rue Sainte-Ursule, Montpellier).
In the Salle Béjart of the Agora, Montpellier Danse hosted Collective Mobile Mapping Montpellier, an event bringing together works by Dominique Cunin and Christophe Domino in alternating and complementary sessions. Espace Puissance Espace, by Dominique Cunin, originates in the immersive projection of the 3D model of an architectural space onto the space itself.
Spectators are invited to control this large image collectively using their smartphones.
The walls move; one enters the depth and intimacy of the building. Centon Digital is a game of meaning and reading, in which players select words or sequences of words that add up to form a projected text. All the walls become screens: the frontality of a "classical" projection disappears, giving way to the spectator's immersion in the textual projection, and to interactivity among the players themselves and with the Salle Béjart.
An idscènes project, organized by EnsadLab, the laboratory of the École nationale supérieure des Arts Décoratifs, Paris, Grande Image Lab, and ESBA TALM-Le Mans, with the support of Esbama Montpellier.
CoSiMa participated in Sonar+D, the international conference on creativity and technology of the Sónar music festival in Barcelona, with a Sonar Innovation Challenge. A team of five musicians, designers, and developers – formed over the month before the event – worked for two and a half days on a music application that lets an audience interact collaboratively through their smartphones. The web-based application was developed with the Soundworks framework.
The resulting application is Weather, a performance for a DJ and an audience participating through their smartphones. As usual in performances based on the Soundworks framework, participants connect their smartphones to the local CoSiMa Wi-Fi network and visit the web page of the Weather application. Once connected, participants can use four gestures to switch between weather states, each associated with different sound textures and visualizations generated on their mobile devices: (1) touching the screen generates the bird chirps of a sunny afternoon, (2) swaying and tilting the device generates wind, (3) shaking it softly generates a rain sound and rain drops on the screen, and (4) shaking it harder generates thunder sounds and lightning on screen.
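The gesture-to-state mapping could be sketched roughly as follows. This is a minimal illustration, not the actual Weather code: the state names, thresholds, and input representation are all assumptions.

```javascript
// Illustrative sketch of mapping touch/motion input to the four
// weather states described above. Thresholds are invented for
// the example, not taken from the Weather application.
const WEATHER_STATES = ['sun', 'wind', 'rain', 'thunder'];

// `touching`: finger on screen; `tilt`: device inclination in degrees;
// `shake`: acceleration magnitude (gravity removed), in m/s^2.
function classifyWeather({ touching = false, tilt = 0, shake = 0 } = {}) {
  if (shake > 15) return 'thunder';       // hard shaking
  if (shake > 5) return 'rain';           // soft shaking
  if (Math.abs(tilt) > 20) return 'wind'; // swaying and tilting
  if (touching) return 'sun';             // touch -> bird chirps
  return 'calm';                          // no interaction
}
```

In a real mobile web client, `tilt` and `shake` would come from the browser's `devicemotion`/`deviceorientation` events, and the returned state would select a sound texture and visualization.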
The sounds generated by the participants create a sound texture distributed over the audience. The current weather states of all clients are collected on the server to generate a weather profile that controls visuals on a public display and environmental sounds on the PA. In addition, the weather profile is interpreted by a DJ playing live electronic music in dialogue with the audience's sound textures.
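Server-side, the aggregation into a weather profile could look like the following sketch. The real server logic is not documented here; normalizing state counts to shares is an assumption made for illustration.

```javascript
// Illustrative sketch: aggregate the per-client weather states
// collected on the server into a global profile. The shares in
// [0, 1] could then cross-fade visuals and PA sounds.
function weatherProfile(clientStates) {
  const counts = { sun: 0, wind: 0, rain: 0, thunder: 0 };
  for (const state of clientStates) {
    if (state in counts) counts[state] += 1;
  }
  const total = clientStates.length || 1; // avoid division by zero
  const profile = {};
  for (const [state, n] of Object.entries(counts)) {
    profile[state] = n / total;
  }
  return profile;
}
```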
The five CoSiMa SIC challengers who developed the Weather performance are Matthew Bethancourt, Andrés Ferraro, JP Carrascal, Chaithanya Jade, and Yuli Levtov.
CoSiMa participated in the Music Tech Fest in Berlin with a workshop, "Hack the Audience", featuring the Soundworks framework. In two days, May 26 and 27, we developed two performances in which the audience participates with their smartphones: MTF Orgy and GrainField. In both performances, the audience connects their smartphones to the CoSiMa Wi-Fi network and visits a given web page to participate.
In MTF Orgy, each participant controls the intensity and detuning of two harmonics of a distributed additive synthesizer – the Orgy organ – by tilting their smartphone. The lower harmonics are generated on the PA and the higher ones on the participants' mobile devices. A musician on stage plays chords on a MIDI keyboard that determine the fundamental frequencies. Other musicians can join the performance. At the MTF performance, we were accompanied by Steve Lawson on bass.
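The distribution of harmonics between the PA and the devices could be sketched as below. The number of PA harmonics, the pairing scheme, and the tilt-to-detune range are assumptions for illustration, not the MTF Orgy code.

```javascript
// Sketch of the distributed additive synthesizer described above:
// low harmonics stay on the PA, each participant's device gets a
// pair of higher harmonics, detuned by device tilt.
const PA_HARMONICS = 4; // harmonics 1..4 on the PA (assumed)

// Pair of harmonic numbers assigned to participant `i` (0-based).
function harmonicsForParticipant(i) {
  const first = PA_HARMONICS + 1 + 2 * i;
  return [first, first + 1];
}

// Frequency of harmonic `n` of fundamental `f0`, detuned by tilt.
// `tilt` in [-1, 1] maps to +/- 50 cents (an assumed range).
function harmonicFrequency(f0, n, tilt = 0) {
  const cents = 50 * tilt;
  return f0 * n * Math.pow(2, cents / 1200);
}
```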
In GrainField, the smartphones let the participants play with the granular synthesis of two seconds of sound recorded from a percussionist sitting in the middle of the audience (see images below). The system periodically records two seconds of sound that are sent to the audience's smartphones, so that the sound a participant plays with changes every 8 seconds. The sound generated by the participants' smartphones can be perceived as a distributed granular echo of the percussionist's performance, without any other amplification.
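The grain scheduling over such a rolling two-second buffer could be sketched as follows. Grain size, spacing, and the position control are invented parameters; the actual GrainField engine (built on Soundworks and Web Audio) is not reproduced here.

```javascript
// Sketch of scheduling grains over a 2-second recorded buffer.
// `position` in [0, 1] selects where in the buffer grains read from,
// which a participant could control by touch or tilt.
function scheduleGrains({ bufferDuration = 2, grainDuration = 0.1,
                          period = 0.05, position = 0.5 } = {}) {
  const count = Math.round(bufferDuration / period); // grains per pass
  const maxOffset = bufferDuration - grainDuration;
  const grains = [];
  for (let i = 0; i < count; i++) {
    grains.push({
      time: i * period,             // when the grain starts playing
      offset: position * maxOffset, // read position in the buffer
      duration: grainDuration,
    });
  }
  return grains;
}
```

In a Web Audio implementation, each grain would be rendered by an `AudioBufferSourceNode` started at `time` with the given `offset` and `duration`, shaped by a short gain envelope.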
In addition, we presented the CoSiMa project in a brief talk and performance with the audience playing birds and drops on their smartphones.
CoSiMa is represented in the Electrosound exhibition at the EDF Foundation with two small installations. One documents the participative concert Chloé ⨉ Ircam and the other is a downscaled version of Collective Loops.
The installation that refers to the participative concert we created with Chloé is based on the same technical setup and music track as the Terminal installation and features 14 wall-mounted smartphones. As in the concert and the Terminal installation, the public can connect to the installation with their mobile devices to participate.
Collective Loops is shown in a reduced version, with eight smartphones mounted on a desk and a screen that replaces the floor projection.
The exhibition also includes the MO – Musical Objects, developed in the framework of one of CoSiMa's predecessor projects, Interlude.
The participative concert Chloé ⨉ Ircam and the interactive installation Terminal, both created in the framework of CoSiMa, have been presented in the documentary "Le Future de la Musique" on the television channel PLANÈTE+. The documentary includes interviews with Chloé and Norbert Schnell as well as footage from a working session with Chloé and Ircam's CoSiMa team in a studio at Ircam.
Here is the trailer of the episode that features the CoSiMa projects:
A second version of the Collective Loops installation has been shown during the Ircam Forum Workshops on November 25 and 26, 2015.
The installation features a collaborative version of a step sequencer that uses the visitors' smartphones to produce sound. The sequencer is graphically represented by a circle of 8 sectors projected on the floor. The sectors light up in a clockwise motion following the beats of the sequence.
When the players connect to the installation through a web page, they choose an available sector and thus their step in the loop. Through a simple graphical interface, the players can control the sounds (i.e., notes of a melody or bass line, and percussion sounds) that are played on their smartphone at the corresponding beat of the sequence. The selected sounds are also displayed in the corresponding sector of the circle on the floor.
Positioned around the circle, the players collaborate on creating melodies and rhythm patterns rendered through their smartphones.
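The core of this collaborative step-sequencer logic could be sketched as follows. The function and field names are illustrative assumptions; the actual Collective Loops implementation runs on the CoSiMa/Soundworks platform.

```javascript
// Sketch of an 8-sector collaborative step sequencer: each player
// claims one sector (one step of the loop) and assigns it a sound.
const NUM_STEPS = 8;

function createSequencer() {
  return { steps: new Array(NUM_STEPS).fill(null), beat: 0 };
}

// A player claims a free sector; returns false if it is taken.
function claimStep(seq, step, playerId, sound) {
  if (seq.steps[step] !== null) return false;
  seq.steps[step] = { playerId, sound };
  return true;
}

// Advance one beat (clockwise) and return what plays on this step,
// or null if the sector is free.
function tick(seq) {
  const playing = seq.steps[seq.beat];
  seq.beat = (seq.beat + 1) % NUM_STEPS;
  return playing;
}
```

On each `tick`, the server would light up the corresponding floor sector and tell the owning player's smartphone to play its selected sound.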
The application uses a first complete version of the CoSiMa platform entirely based on web standards.
Design and development:
Ircam: Norbert Schnell, Jean-Philippe Lambert, Benjamin Matuszewski, Sebastien Robaszkiewicz
Orbe: Xavier Boissarie, Florent Dubois, Gregory Cieslik, Tomek Jarolim, Quentin Levigneron
EnsadLab: Samuel Bianchini, Dominique Cunin, Oussama Mubarak, Jonathan Tanant
ID Scènes: Christophe Aubry, Fabrice Auchere
NoDesign: Jean-Louis Frechin, Uroš Petrevski
ESBA TALM: Christophe Domino
Norbert Schnell presented the CoSiMa project at Ableton’s Loop Summit for Music Makers. During the presentation, the audience performed with various CoSiMa prototype applications using their smartphones.
Terminal is an interactive installation created in collaboration with Chloé and the Scale collective for the Paris Musique Club. The installation will be shown from October 24, 2015 to January 31, 2016 at the Gaîté Lyrique.
The project transposes the musical elements and mobile interactions of the Chloé ⨉ Ircam concert into the situation of an exhibition.
The installation features a looped 15-minute 4-channel music track staged in a 7-meter corridor, with 21 smartphones aligned along the wall and luminous lines running on the floor.
As in the concert, visitors can connect to the installation with their mobile devices to participate. At given passages of the music track, the participants are invited to play sounds with touch and motion interfaces that appear on their mobile devices. The graphical animations and sounds of their device are echoed by one of the smartphones on the wall.
Every now and then, waves of sound textures appear on the participants' mobile devices. In addition, visitors can use a wall-mounted tablet to distribute sound textures over the smartphones on the wall. The light on the floor reacts to the music as well as to the visitors' interactions with the tablet.
This video summarizes the concert from a rather technical point of view.
As at the Fête de la Musique, the audience participates in this concert by connecting their smartphones to the local Wi-Fi network Chloe × Ircam and opening the web page chloe.ircam.fr in their browser. Once connected, the participants are asked to indicate their approximate position on a map of the concert space. During the concert, Chloé can move sounds over the audience's smartphones – using four tablets integrated into her setup – and make dedicated sound interfaces appear on their touchscreens. The concert starts and ends with everybody playing with Chloé's whispering voice.
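Moving a sound over the audience's smartphones based on the positions they indicated could be sketched as a distance-based gain law. The linear falloff and the normalized map coordinates are assumptions for illustration, not the actual concert software.

```javascript
// Sketch: per-client gain for a sound placed at a point on the map,
// using the approximate positions participants indicated. Positions
// are normalized {x, y} map coordinates in [0, 1].
function gainForClient(clientPos, soundPos, radius = 0.3) {
  const dx = clientPos.x - soundPos.x;
  const dy = clientPos.y - soundPos.y;
  const d = Math.hypot(dx, dy);
  // Linear falloff: full gain at the source, silent beyond `radius`.
  return Math.max(0, 1 - d / radius);
}
```

Animating `soundPos` along a trajectory and recomputing each client's gain would make the sound appear to travel across the audience.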
The Orbe collective presented three experimental scenarios of augmented soundwalks. The participants, equipped with a smartphone, are invited to experience an augmented audio reality that reacts to their position, trajectory, and movement (using GPS, BTLE beacons, and motion sensors). Each scenario proposes a different narrative and leads the participants along different possible trajectories through the same district of Chalon-sur-Saône. The trajectories take between 30 minutes and 2 hours, depending on the participants' preferences and their engagement with the proposed activities.
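The position-reactive behavior of such a soundwalk typically relies on audio zones triggered when the listener enters a radius around a point. The following is a generic sketch of that standard technique; the zone data is an invented example and the scenarios' actual logic is not reproduced here.

```javascript
// Sketch of GPS-triggered audio zones for an augmented soundwalk:
// a zone's sound becomes active when the listener is within its radius.
const EARTH_RADIUS_M = 6371000;

// Great-circle (haversine) distance in meters between {lat, lon} points.
function distanceMeters(a, b) {
  const rad = (deg) => (deg * Math.PI) / 180;
  const dLat = rad(b.lat - a.lat);
  const dLon = rad(b.lon - a.lon);
  const h = Math.sin(dLat / 2) ** 2 +
            Math.cos(rad(a.lat)) * Math.cos(rad(b.lat)) * Math.sin(dLon / 2) ** 2;
  return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(h));
}

// Return all zones containing the listener's current position.
function activeZones(position, zones) {
  return zones.filter((z) => distanceMeters(position, z.center) <= z.radius);
}
```

BTLE beacons and motion sensors would complement GPS indoors and for finer-grained gestures, as mentioned above.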
The trajectories of all participants were recorded and visualized on a screen at the arrival point, where the team invited the participants to a debriefing of their experience.