Portfolio

This is a portfolio presenting personal and client works that engage with data sets. The documentation features in-depth descriptions of the technological components and links to relevant code on GitHub.


Personal works


Subjective Cartography I (City?) 


Website[link]
Year of completion: 2013
Type: Generative audiovisual installation connected to a weather API.

General description: An urban-like architecture constrains the movement of particles, much like the movement of humans inside a city. It is an imaginary city in continuous transformation, where graphic components are audio-reactive and also drive sonic actions. Images and synthetic sounds are rendered in real time by an algorithm fed with meteorological data. The speed and direction of the wind, the humidity and the felt temperature in three randomly selected cities generate transformations that take place over several days and invite passive contemplation.

Description of technological components: The installation is made of two applications running in parallel. The first one, written in C++ (openFrameworks), is a visual particle system mimicking urban traffic. The physical model governing the behaviour of these particles is itself a particle system connected to meteorological data. Every five minutes, a request is made to an online weather API, and several data types, such as temperature, wind speed, wind direction and barometric pressure, are parsed from the JSON response. In total, seven types of data are explicitly mapped to the particle system's global parameters. The data is then sent via the OSC protocol to the second application, written in SuperCollider, which takes care of sound through a granular synthesis algorithm whose details can be found here. These two parallel algorithms change very slowly over time and display different configurations that can take several days to form. This is indeed a very contemplative work. Something about melancholy and the passage of time.
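The explicit mapping step can be sketched as a clamped linear rescaling from a weather reading to a particle-system parameter. This is an illustrative sketch only; the function names, value ranges and the wind-speed example are assumptions, not the installation's actual code.

```cpp
#include <algorithm>

// Hypothetical mapping: a weather value in [inMin, inMax] is clamped
// and rescaled to a particle-system parameter in [outMin, outMax].
double mapRange(double value, double inMin, double inMax,
                double outMin, double outMax) {
    double t = (value - inMin) / (inMax - inMin);
    t = std::clamp(t, 0.0, 1.0);
    return outMin + t * (outMax - outMin);
}

// Example: wind speed in km/h (0-100) drives a global particle speed (0.0-5.0).
double windToParticleSpeed(double windKmh) {
    return mapRange(windKmh, 0.0, 100.0, 0.0, 5.0);
}
```

Seven such mappings, one per data type, would be enough to steer the global behaviour of the particle system between API requests.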

Relevant code available here: [link]

  • Cartographies Subjectives (video). - Photo credit: P. Saint-Denis

Subjective Cartography II (Street View) 



Website[link]
Year of completion: 2018
Type: Robotic interactive installation.
Support: Conseil des arts et des lettres du Québec, Canada Council for the Arts

General description: An installation made of a TV screen and an accordion robot choir. Images of suburban streets are displayed on screen, analyzed by computer and transposed to sound on accordions. The skyline generates a melody that follows the contours of trees and suburban architecture.

Description of technological components: This installation gives some agency to the public. A graphical interface on an iPad lets visitors choose between five trajectories in Brossard (10-30, Boulevard Taschereau, random streets, etc.). The trajectory is then displayed on a TV screen while an application written in C++ (openFrameworks) detects the skyline, that is, the first non-blue pixel in each column scanning from the top, relative to the height of the canvas. This data is sent via OSC to the micro-controllers sitting in each accordion's base platform, which relay it to a solenoid driver in DMX format in order to map the skyline onto the accordions' white keys.
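The skyline detection and the key mapping can be sketched as follows. This is a simplified illustration, not the installation's code: pixels are reduced to a boolean "is sky blue" grid, whereas the real application works on openFrameworks video frames, and the key-mapping formula is an assumption.

```cpp
#include <cstddef>
#include <vector>

// For each column, scan from the top and return the row index of the
// first pixel that is not sky blue; `rows` means no skyline was found.
std::vector<int> skylineHeights(const std::vector<std::vector<bool>>& isBlue) {
    std::size_t rows = isBlue.size();
    std::size_t cols = isBlue.empty() ? 0 : isBlue[0].size();
    std::vector<int> skyline(cols, static_cast<int>(rows));
    for (std::size_t c = 0; c < cols; ++c)
        for (std::size_t r = 0; r < rows; ++r)
            if (!isBlue[r][c]) { skyline[c] = static_cast<int>(r); break; }
    return skyline;
}

// Map a skyline row (0 = top of canvas) to one of `numKeys` white keys:
// a higher skyline (smaller row index) gives a higher key.
int rowToKey(int row, int rows, int numKeys) {
    return (rows - 1 - row) * numKeys / rows;
}
```

A key index per column, sent over OSC, is all the micro-controllers need before translating it to DMX for the solenoid drivers.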

There is a rich history of transposing landscape to music. For one reason or another, I am not trying here to engage with landscape in a pastoral way. The landscape that is mine, the one I grew up in, is that of the North American suburbs, consumerism and empty parking lots. The interactions make it possible to establish symbolic connections and to fuel those endless melismas with humour and self-mockery. Little houses, little melodies.

Relevant code available here: [link]

  • Cartographies Subjectives (accordions) @ FTRL. - Photo credit: P. Saint-Denis

You Mean It’s Just Sounds? 



Website[link]
Year of completion: 2016
Type: Biometric audio performance
Duration: 15 minutes
Premiere: 5 March 2016, Sonora Festival, Madrid, ES
Nominations: Arte Laguna Festival (Venice, IT, 2015)
Support: Conseil des arts et des lettres du Québec, Canada Council for the Arts

General description: In this performance I use two sensors developed by Thalmic Labs to capture the orientation and electrical activity of the muscles in my forearms. Through custom software, my gestures are transposed to sound in a choreography that stands somewhere between the theremin and the air guitar.

On another level, the project is invested in recreating, within the digital context, a gesture-sound relationship inspired by the one present in traditional instruments. For a long time, gesture and sound were closely linked, bound in a certain way by the physicality of musical instruments. It was with the democratization of electricity that the energy used to produce sound began to detach itself from the sound output. In instruments like the theremin, the Hammond organ or the electric guitar, to name a few, musicians only “control” sound while the energy to produce it comes from electricity. This separation is even more obvious in the digital context, where gestures are translated to data, interpreted by a computer and then rendered to sound or to mechanical instruments.

When I build musical instruments, I usually try to wander away from the gesture-sound relationship inherited from tradition. I try to build new, original instruments that are works in themselves, aiming to blur the boundaries between the work and the instrument. This project is different, however, because it features a more traditional approach to gesture, sound and performance. It feeds on works by great artists such as David Rokeby (Very Nervous System), Atau Tanaka (BioMuse), Thierry de Mey (Light Music) and many others who are invested in contactless gestural controllers or, in a larger context, in using the body as a musical instrument.

Description of technological components: The performance is based on two applications running in parallel. The data from the interface (Thalmic Labs' Myo) is first sent to an openFrameworks app that transforms quaternions to Euler angles and displays a 3D model of my hands for visual feedback. The data is then sent through OSC to SuperCollider in order to control three synths and trigger algorithmic sound events involving stochastic processes (random weights). All mappings are explicit; some are divergent, others direct.
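The quaternion-to-Euler conversion is a standard computation; a minimal sketch is below. It uses the textbook Z-Y-X (yaw-pitch-roll) convention, which is an assumption: the performance software may use a different axis convention or openFrameworks' own math types.

```cpp
#include <cmath>

struct Euler { double roll, pitch, yaw; }; // angles in radians

// Convert a unit quaternion (w, x, y, z) to Euler angles,
// Z-Y-X (yaw-pitch-roll) convention.
Euler quatToEuler(double w, double x, double y, double z) {
    Euler e;
    e.roll = std::atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y));
    double s = 2.0 * (w * y - z * x);
    s = s > 1.0 ? 1.0 : (s < -1.0 ? -1.0 : s); // clamp for numerical safety
    e.pitch = std::asin(s);
    e.yaw = std::atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z));
    return e;
}
```

Euler angles are easier to map to synthesis parameters than raw quaternions, since each angle varies independently over an intuitive range.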

Relevant code available here: [link]

  • You Mean It's Only Sounds? @ Arsenal. - Photo credit: Arsenal

Works for clients


Isotopp (Herman Kolgen)



Client: Herman Kolgen
Website[link]
Year of completion: 2018
Type: Installation and performance

General description: Based on two years of collaboration with the researchers Jean-Charles Thomas and Thomas Roger, Herman Kolgen has conceived a system built on an interpretation of GANIL's research activities for a visual and sound performance and installation. On stage, the system is linked in real time with the nuclear research center. Several prototypes were made in Montreal and Europe before the first public appearance with the scientists on May 12th at the Cargo center. GANIL studies nuclear matter by colliding nuclei at very high speed: a beam of heavy ions (elements whose mass lies between carbon and uranium) is crashed into target nuclei at 100,000 km/s. The effects of this bombardment are recorded by sensors, then studied in detail by the researchers. Herman Kolgen conceptualizes a vision that aims to express the research led by GANIL, transposing this knowledge, grounded in real data, to create a dynamic visual and sonic substance.

Description of technological components: The GANIL center has opened a connection to its sensor data and sends POST requests to a PHP server. The data is then fetched (via HTTP requests to the PHP server) by a C++ (openFrameworks) application, where it is connected to different visual algorithms (direct, implicit mappings) and also sent via OSC to Herman Kolgen's audio renderer.
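Once fetched, the server's response has to be parsed before the values can feed the visual algorithms and the OSC sender. The sketch below assumes, purely for illustration, that the sensor feed arrives as one comma-separated line of numbers; the actual wire format used by GANIL is not documented here.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Hypothetical parsing step: split a comma-separated line of sensor
// readings into doubles, ready to be mapped to visuals or sent via OSC.
std::vector<double> parseSensorLine(const std::string& line) {
    std::vector<double> values;
    std::stringstream ss(line);
    std::string field;
    while (std::getline(ss, field, ','))
        values.push_back(std::stod(field));
    return values;
}
```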

Relevant code available here: [link]


Eotone (Herman Kolgen)


Client: Herman Kolgen
Website[link]
Year of completion: 2014
Type: Installation

General description: With Eotone, Kolgen and Letellier reflect on distance and weather by staging something intangible yet powerful: the wind. Four sound and sculptural diffusers, containing elements of both the weather vane and the fog horn, make up this monumental installation that renders in movement and sound the direction and force of the wind blowing simultaneously on two continents: in Montreal and Quebec City on one side of the Atlantic, in Rennes and Nantes on the other. The wind data recorded in each city is transmitted live to the diffusers, controlling the orientation of each of the structures and orchestrating the combined chords that make up the harmonic whole perceived at the heart of the installation. By transforming weather data into sound, Eotone offers a subtle artistic vision of the Internet of Things.

Description of technological components: LCJ Capteurs nautical wind sensors are placed in four different cities (Montreal, Quebec City, Rennes and Nantes), hooked to Ethernet-enabled Arduinos that post data to a server (PHP) every two seconds. An on-site computer connected to the internet then fetches this data (via PHP) and distributes it to micro-controllers sitting in each of the robotic structures' platforms, where motor control is performed (PID control, with positioning made possible by rotary encoders).
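The positional PID loop on each micro-controller can be sketched as below. This is a minimal, generic PID controller, not the installation's firmware; the struct name, gains and units are illustrative assumptions.

```cpp
// Minimal positional PID loop of the kind used to orient a structure
// toward a target encoder position; gains and units are illustrative.
struct Pid {
    double kp, ki, kd;                        // controller gains
    double integral = 0.0, prevError = 0.0;   // loop state

    // target and measured are encoder positions; dt is the loop period (s).
    // Returns the motor command (e.g. a signed PWM duty value).
    double update(double target, double measured, double dt) {
        double error = target - measured;
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};
```

Calling `update` once per control period with the latest encoder reading drives the structure's heading toward the wind direction received from the server.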

Relevant code available here: [link]