CoSiMa presented a Work-in-Progress paper at the Tangible, Embedded and Embodied Interaction conference held at Stanford University in January 2015 (TEI’15). The paper Collective Sound Checks — Exploring Intertwined Sonic and Social Affordances of Mobile Web Applications describes the mobile-web scenarios we tested with the Studio 13/16 at the Centre Pompidou, and explores how these new forms of musical expression strongly shift the focus of design from human-computer interaction towards the emergence of computer-mediated interactions between players, based on the sonic and social affordances of ubiquitous technologies.
We presented our work during the poster session, where it drew a lot of attention: attendees had great fun playing with the CoSC web applications and We Will Rock You: Reloaded, were impressed by the work, and said they looked forward to the upcoming developments.
An interactive public installation with smartphones, Fête des Lumières, Lyon, December 2014
Overexposure is an interactive work bringing together a public installation and a smartphone application. On an urban square, a large black monolith projects an intense beam of white light into the sky. Visible all over the city, the beam turns off and on, pulsating with a rigor that suggests a will to communicate, even if we do not immediately understand the signals it produces. On one side of the monolith, white dots and dashes scroll past, from the bottom up, marking the installation with their rhythm: each time one reaches the top of the monolith, the light goes off, as if the marks were emptying into the light. On a completely different scale, the same marks scroll across the smartphone screens of the people in attendance, interacting with the work and following the same rhythm. Here, it is the smartphones’ flash that releases light according to the coded language. These are in fact messages being sent, in Morse code, from everyone, to everyone and to the sky, and we can read them thanks to the surtitles that accompany the marks. Using a smartphone, anyone can send a message, saying what they think and thereby presenting themselves, for a few moments, to everyone, to a community sharing the same time and the same rhythm. And we can take the pulse of an even larger community, on the scale of the city and in real time, through a map of mobile phone network use that can be visualized on one side of the monolith or on a smartphone.
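The Morse mechanics behind the pulses can be sketched in a few lines. This is purely illustrative, not the installation's actual code; the timing follows standard Morse conventions (a dot lasts one unit, a dash three, with one-unit gaps between symbols, three between letters, and seven between words):

```javascript
// Illustrative sketch: encode a text message as Morse, then as
// on/off pulse durations driving a light (beam or smartphone flash).
const MORSE = {
  A: '.-', B: '-...', C: '-.-.', D: '-..', E: '.', F: '..-.',
  G: '--.', H: '....', I: '..', J: '.---', K: '-.-', L: '.-..',
  M: '--', N: '-.', O: '---', P: '.--.', Q: '--.-', R: '.-.',
  S: '...', T: '-', U: '..-', V: '...-', W: '.--', X: '-..-',
  Y: '-.--', Z: '--..',
};

function toMorse(text) {
  return text.toUpperCase().split(' ')
    .map(word => word.split('').map(c => MORSE[c] || '').join(' '))
    .join(' / '); // '/' separates words
}

// Dot = 1 unit on, dash = 3 units on; gaps: 1 unit between symbols,
// 3 between letters, 7 between words.
function toPulses(morse) {
  const pulses = [];
  morse.split(' / ').forEach((word, w) => {
    if (w > 0) pulses.push({ off: 7 });            // word gap
    word.split(' ').forEach((letter, l) => {
      if (l > 0) pulses.push({ off: 3 });          // letter gap
      letter.split('').forEach((sym, s) => {
        if (s > 0) pulses.push({ off: 1 });        // symbol gap
        pulses.push({ on: sym === '-' ? 3 : 1 });
      });
    });
  });
  return pulses;
}
```

Scaling the unit duration up or down is then enough to drive the monolith's beam and the phones' flashes at the same shared rhythm.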
From an individual device (a smartphone) the size of a hand to a shared format on the scale of the city, a momentary community forms and transforms, sharing a space, a pace, and the same data, following a mode of communication whose ability to bring people together through a sensory experience matters more than the meaning of the messages it transmits or their destination, which is lost in the sky.
(Photos: Samuel Bianchini)
An Orange/EnsadLab project
A project under the direction of Samuel Bianchini (EnsadLab), in collaboration with Dominique Cunin (EnsadLab), Catherine Ramus (Orange Labs/Sense), and Marc Brice (Orange Labs/Openserv), in the framework of a research partnership with Orange Labs
“Orange/EnsadLab” partnership directors: Armelle Pasco, Director of Cultural and Institutional Partnerships, Orange, and Emmanuel Mahé, Head of Research, EnsAD
Project Manager (Orange): Abla Benmiloud-Faucher
IT Development (EnsadLab): Dominique Cunin, Oussama Mubarak, Jonathan Tanant, and Sylvie Tissot
Mobile network data supply: Orange Fluxvision
Mobile network data processing: Cezary Ziemlicki and Zbigniew Smoreda (Orange)
SMS Server Development: Orange Applications for Business
Graphic Design: Alexandre Dechosal (EnsadLab)
In situ installation (artistic and engineering collaboration): Alexandre Saunier (EnsadLab)
Lighting and construction of the installation structure: Sky Light
Wireless network deployment coordination: Christophe Such (Orange)
Message moderators: Élodie Tincq, Marion Flament, Charlotte Gautier
Executive Production: EnsadLab
Research and development for this work was carried out in connection with the research project Cosima (“Collaborative Situated Media”), with the support of the French National Research Agency (ANR), and participates in the development of Mobilizing.js, a programming environment for mobile screens developed by EnsadLab for artists and designers
This application allows a group of players to perform Queen’s song “We Will Rock You” with a set of simple instruments and to create their own versions of the song. The players can choose between drums, voice solo, choirs, Freddie Mercury’s voice fill-ins (‘sing it’), a guitar power chord, and the final guitar riff.
While most of the instruments trigger segments of the original recording when the player strikes in the air with the device, the power chord and the guitar riff resynthesize guitar sounds through granular synthesis.
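The core of granular resynthesis is scheduling many short, overlapping grains read from a source buffer. A simplified scheduler might look like the sketch below; the parameter names are ours, not the application's, and in the real app each scheduled grain would be played back through the Web Audio API:

```javascript
// Simplified granular-synthesis scheduler: computes when each grain
// starts and where in the source buffer it reads from.
// All times are in seconds; parameter names are illustrative.
function scheduleGrains({ duration, grainSize, overlap, position, jitter }) {
  // Hop between grain onsets; overlap = 2 means grains half-overlap.
  const hop = grainSize / overlap;
  const grains = [];
  for (let t = 0; t < duration; t += hop) {
    // Read position in the source buffer, with a little random jitter
    // so the texture does not sound static.
    const offset = position + (Math.random() * 2 - 1) * jitter;
    grains.push({ onset: t, offset: Math.max(0, offset), size: grainSize });
  }
  return grains;
}
```

Sweeping `position` through the buffer while the player holds a gesture is what produces the stretched, "electric" quality of the resynthesized chord.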
The application has been published here (it requires a mobile device running iOS 6 or later, or Android 4.2 or later with Chrome 35 or later).
In this scenario, players can record arbitrary percussive sounds (with their voice or using props) using a microphone and a foot pedal. Once the recording is finished, the players can load the recorded sound onto their mobile devices and perform it by shaking them. All the devices are synchronized to a steady beat (16th notes at 100 BPM) so that multiple players can easily perform together.
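The synchronization idea is simple arithmetic once the devices share a common clock origin (distributed, for instance, by a server): each device derives the next 16th-note onset locally. A minimal sketch, with illustrative names:

```javascript
// Beat-synchronization sketch: devices share a time origin and
// compute the next 16th-note onset themselves. Times in seconds.
const BPM = 100;
const PERIOD = 60 / BPM / 4; // one 16th note at 100 BPM = 0.15 s

// Given the shared time origin and the current synchronized time,
// return the time of the next 16th-note onset.
function nextBeatTime(origin, now) {
  const elapsed = now - origin;
  const beats = Math.ceil(elapsed / PERIOD);
  return origin + beats * PERIOD;
}
```

Because every device rounds up to the same grid, sounds triggered on different phones fall on the same 16th-note pulse even without continuous communication.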
Each sound recording is analysed on the server and segmented into percussive elements that are classified by intensity. On the mobile device, a concatenative synthesizer generates a sound on each beat. Each sound segment is selected according to the intensity of the device’s motion: the synthesizer plays soft segments when the player shakes the device gently, and louder segments when the player shakes it more vigorously.
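The selection step can be pictured as a simple lookup into the intensity-sorted segments. This is a sketch of the idea, not the project's actual selection code; field names are illustrative:

```javascript
// Illustrative selection logic for a concatenative synthesizer:
// segments are sorted by analysed intensity, and the normalized
// device-motion intensity (0..1) picks the matching one.
function selectSegment(segments, motionIntensity) {
  // segments: [{ start, duration, intensity }], intensity in 0..1
  const sorted = [...segments].sort((a, b) => a.intensity - b.intensity);
  const index = Math.min(
    sorted.length - 1,
    Math.floor(motionIntensity * sorted.length)
  );
  return sorted[index];
}
```

Gentle shaking thus lands in the quiet end of the recording's material, vigorous shaking in the loud end, whatever the player happened to record.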
The players are encouraged to record phrases with percussive elements of a wide dynamic range. They can experiment with different sound recordings and create an ensemble by recording complementary materials.
The players are sitting on a grid (for instance, 3 rows by 4 columns, for a total of 12 people). Their mobile devices form a matrix of screens and loudspeakers that are used to spatialize sound and light.
For now, the Matrix is performed by one player at a time: a representation of the matrix appears on the screen of the player who becomes the performer. By moving a finger over the on-screen matrix, the performer controls which smartphones the light and sound come from in the real world (the sound changes with the speed of the finger’s trajectory). In this way, the performer remotely plays the other participants’ instruments. After a fixed time, another player takes over the control of sound and light and becomes the new performer.
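Mapping the performer's touch to a device in the grid amounts to quantizing normalized screen coordinates to a cell. A minimal sketch, assuming row-major device numbering (an assumption on our part):

```javascript
// Map a normalized touch position (x, y in 0..1) on the on-screen
// matrix to the grid cell, i.e. the smartphone that should light up
// and play. Row-major indexing is assumed for illustration.
function touchToCell(x, y, cols, rows) {
  const col = Math.min(cols - 1, Math.floor(x * cols));
  const row = Math.min(rows - 1, Math.floor(y * rows));
  return { row, col, index: row * cols + col };
}
```

For the 3-row by 4-column seating described above, `index` ranges over the 12 devices.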
The video below gives an idea of the technical setup. While the players usually sit 1 or 2 meters apart, the smartphones were placed only a few centimeters apart for the purpose of this video.
The sound is generated locally on the mobile devices, which are connected to a WebSocket server (using Node.js and socket.io). The server receives the position from the performer’s device and controls the sound generators of all the devices in the matrix.
The first set of applications we developed is a collection of gadgets that produce sound depending on the device’s motion. The gadgets can be played individually or with a group of players, and allow for exploring different techniques, sound materials, and metaphors. The drone, birds, monks, and rainstick gadgets are described below.
In addition to these gadgets we have experimented with collaborative scenarios that are described in separate posts:
The gadgets and the We Will Rock You: Reloaded application have been published at http://cosima.ircam.fr/checks (the applications work on mobile devices and require at least iOS 6 or Android 4.2).
The drone reacts to the device’s rotation with amplitude and frequency modulation of a set of oscillators generating a bass drone. Strongly shaking the device produces an electric sound synthesized through granular synthesis.
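A mapping from device orientation to drone parameters could look like the following. This is a hypothetical mapping of our own, not the gadget's documented one; `beta` and `gamma` are the front-back and left-right tilt angles in degrees, as reported by the browser's `deviceorientation` event:

```javascript
// Hypothetical orientation-to-drone mapping (illustrative only).
// beta: front-back tilt (-180..180), gamma: left-right tilt (-90..90).
function droneParams(beta, gamma) {
  // Front-back tilt controls overall amplitude, clamped to 0..1.
  const gain = Math.min(1, Math.max(0, (beta + 90) / 180));
  // Left-right tilt detunes the oscillator bank by up to +/- 50 cents.
  const detune = (Math.max(-90, Math.min(90, gamma)) / 90) * 50;
  return { gain, detune };
}
```

In the app, `gain` and `detune` would feed the gain nodes and oscillator detune of a Web Audio graph on every orientation event.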
Birds is a collection of bird sounds that are played by jiggling the device. Each player can try different bird calls. Two or more players can communicate through tweeting and create a forest-like atmosphere of distributed bird sounds answering to each other.
Monks features a short extract of the song “Early Morning Melody” from Meredith Monk’s Book of Days and a recording of a Tibetan chant. Both extracts are performed through granular synthesis by tilting the device sideways. A group of players can form a choir.
The rainstick is based on sound materials created by Pierre Jodlowski and also used for the audiovisual installation Grainstick, produced at Ircam in 2010. The player holds the device horizontally and tilts it up and down like a rainstick to produce sound.
An important aspect of CoSiMa is testing the user scenarios, technologies, and content developed in the framework of the project with a community of users.
“Collective Sound Checks” are regular events that allow us to try out new developments with a larger number of users and to validate the technological, aesthetic, and social hypotheses of our work. Each event proposes different experiences inviting users to play music together, play a game, or discover augmented-reality spaces.
The first CoSiMa Collective Sound Checks were conducted in collaboration with the Studio 13/16.
A first series of workshops at the Studio 13/16 took place in spring 2014, on May 14, May 28, and June 14 (for the Open House at Ircam). A second series followed in fall and winter 2014, on October 1, October 15, November 5, and December 17.
Orbe created Murmures Urbains, an emergent fiction built on the principle of post-narrative writing. First, a protocol-based staging sparks multiple situations in public space. Then, the traces and testimonies from these experiments are gathered in a scenic space. In the epilogue, the course of events and the stories are presented in an exhibition space.
Murmures Urbains is a rich context for experimenting with situated media. The framework used for these experiments foreshadows the Cosima platform. With Medias-Situés, you can associate media with a combination of trigger constraints: spatial, temporal, environmental, or behavioral conditions. The framework also allows you to synchronize events between multiple mobile devices, manage spatialized sound, or hybridize remote spaces. Murmures Urbains has been deployed in several contexts: workshops in art and design schools such as ENJMIN (National School of Video Games and Interactive Media), and during festivals and events like Chalon dans la Rue.
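The "combination of trigger constraints" idea can be sketched as predicates that must all hold before a media item plays. The names and structure below are our own illustration, not the Medias-Situés API:

```javascript
// Hypothetical sketch of trigger constraints: a media item plays
// only when all of its conditions (spatial, temporal, environmental,
// behavioral) hold for the current context.
function shouldTrigger(media, context) {
  return media.constraints.every(check => check(context));
}

// Example constraint builders (illustrative).
const inZone = (center, radius) => ctx =>
  Math.hypot(ctx.position.x - center.x, ctx.position.y - center.y) <= radius;
const afterTime = t => ctx => ctx.time >= t;

// An example media item: a sound that plays only within 100 m of a
// point, and only after a given moment in the scenario.
const media = {
  url: 'whisper-01.mp3', // illustrative
  constraints: [inZone({ x: 0, y: 0 }, 100), afterTime(1000)],
};
```

Composing conditions this way lets one declaration cover purely spatial triggers, purely temporal ones, or any mix of the two.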
Murmures Urbains will be in residence at L’Hôpital Ephémère in April 2015 and will be presented at Chalon dans la Rue festival in July 2015.