Awards & Nominations

Inc.K has received the following awards and nominations. Way to go!

Local Peoples' Choice Winner
Global Nominee

The Challenge | Artify the Earth

Use NASA Earth imagery data to create 1) an art piece, or 2) a tool that allows the imagery to be manipulated to create unique pieces of art.

Solar System Orchestra

Transforming solar system movements into sound waves. Hear the music of the universe.



1) Theoretical findings

Solar System Orchestra takes its roots in the artistic movement called musicalisme. The Lausanne-born artist Charles Blanc-Gatti (1890 - 1966) is one of its most famous representatives. A painter and pioneer of animated film, he studied the symptoms of synesthesia and how music can be represented by colors and shapes. He expressed some of his findings with diagrams, tables and chromatic discs.

The three dominant colors of each planet were extracted and matched with Blanc-Gatti's tables to define specific keys.

Planets were now music: notes and chords.

Through the use of multiple synthesisers, continuous sound waves reflecting the "endless" cyclic vibration of each planet were produced.

An orchestra where Earth is the maestro, listening to its perpetually moving sister planets.

One of the datasets reflecting these dynamics is the distance of each planet to Earth, extracted from the ephemerides of NASA's Horizons system.

Effects and filters were then added to create a unique dynamic variation of each sound wave.

The result is an orchestra, where each planet is an instrument, defined by the color of its surface, size, orbital speed and relative distance to Earth.

An artistic and scientific digital orchestra, expressing a sonic representation of the solar system.


2) Development process

The development process went through several distinct steps in order to design a specific experience.

It began with the problem of having live video streams (such as Space Live Feeds) that were muted or carried only occasional radio communication sounds. Pixel synthesisers were tested, but the results were not satisfying: they created abstract sounds out of the video feed, and their interest quickly proved limited. In particular, we could not create a sound that was indexed to a tangible data set.

Then came the idea of having spatial data modify the various parameters of a synthesiser.

The first reference we used was Musica Universalis, the Music of the Spheres: "an ancient philosophical approach that regards proportion in the movement of celestial bodies as a form of music" (source: Wikipedia). This Earth-centred, anthropocentric approach to the Universe was kept as a reference, in opposition to the standard heliocentric approach.

Inc.K's take on "Artifying the Earth" was now "Artifying the Earth and its relation to the solar system".

To pursue the research, the project value chain was split into two parts:

- The beginning, where the necessity was to explore vast amounts of data to find dynamic, near-real-time solar system data relating Earth to the other planets.

- The end, where a synthesiser had to be fully set up with its oscillators, filters, frequencies, etc. As seven chords had to play together, harmony was researched during an experimental phase conducted to set up the synthesisers. This synthesiser was developed in JavaScript.
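As an illustration of that second part, a chord of continuous sine oscillators can be rendered into a sample buffer. This is only a minimal sketch of the idea, not the project's actual code: the function name, frequencies and sample rate are assumptions.

```javascript
// Minimal sketch: render a three-key chord (one key per dominant color)
// as summed sine waves into a sample buffer. Frequencies are illustrative.
function renderChord(freqs, seconds, sampleRate = 44100) {
  const n = Math.floor(seconds * sampleRate);
  const out = new Float32Array(n);
  for (let i = 0; i < n; i++) {
    const t = i / sampleRate;
    let sum = 0;
    for (const f of freqs) sum += Math.sin(2 * Math.PI * f * t);
    out[i] = sum / freqs.length; // keep samples within [-1, 1]
  }
  return out;
}

// a hypothetical C-major chord standing in for one planet's three keys
const chord = renderChord([261.63, 329.63, 392.0], 1.0);
```

In the browser, such a buffer could be played through the Web Audio API; equivalently, the same three frequencies could drive three oscillator nodes directly.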

While these two steps were being worked on, a first flat version of the user interface and experience was prototyped, as well as the global identity, including logo, typography, color code...

Asteroids were also included in the system to produce a more punctual, less wavy sound.

Finally, the flat design was brought together with the JavaScript tool in an HTML/CSS combination to create a web application hosted on the Inc.K server.


3) Data sets

Perpetual signature sounds were designed to translate the vibration of each planet, based on different data sets extracted from NASA databases. Some of them were downloaded, and some are connected directly through one of NASA's APIs.

Static datasets:

  • A color for a key. The dominant colors of each planet of the solar system were extracted from NASA RGB images and converted into wavelengths on the nanometer scale. At the same time, musical frequencies were considered as waves traveling in vacuum, and their wavelengths in meters were matched with the values of the colors' wavelengths in nanometers. The three dominant colors of each planet's surface define three different keys, forming a chord.
  • A size for an octave. The bigger the planet, the lower the key.
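The color-to-key matching above can be sketched numerically. Following the text, the color's wavelength value in nanometers is reused as a sound wavelength in meters, with the wave speed taken as the speed of light; folding the result down by octaves into the audible band, and shifting one octave down per size rank, are assumptions added here for illustration.

```javascript
// Sketch of the stated matching: a color wavelength in nanometers is
// read as a sound wavelength in meters, traveling at the speed of light.
const C_VACUUM = 299792458; // m/s

function colourToKey(wavelengthNm) {
  let freq = C_VACUUM / wavelengthNm; // Hz, treating the nm value as meters
  // assumption: halve (one octave down) until the key is audible
  while (freq > 20000) freq /= 2;
  return freq;
}

// "a size for an octave": assumption — one octave down per size rank,
// rank 0 being the smallest planet, so bigger planets sound lower
function keyForPlanet(wavelengthNm, sizeRank) {
  return colourToKey(wavelengthNm) / Math.pow(2, sizeRank);
}

// e.g. a reddish dominant color at ~700 nm
const key = colourToKey(700); // ≈ 13.4 kHz after five octave shifts
```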

Dynamic datasets:

  • Orbital speed modifies the cutoff of the low-pass filter.
  • Rotation speed variation modifies the fine tuning of the key.
  • The distance from each planet to Earth defines the volume variations and the pan.
  • Asteroids are daily data applied to Earth's full revolution. The sound of each is determined by its magnitude, speed and nearest distance to Earth; they punctuate the whole system with specific sounds.
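The dynamic mappings above can be sketched as one function from orbital data to synthesiser parameters. Only the parameter pairings come from the text; the ranges, curve shapes and names below are illustrative assumptions.

```javascript
function lerp(a, b, t) { return a + (b - a) * t; }

// clamp a value into [0, 1] relative to an expected min/max range
function norm(v, min, max) {
  return Math.min(1, Math.max(0, (v - min) / (max - min)));
}

// assumed range: orbital speeds up to ~50 km/s (Mercury, the fastest, is ~47)
function dynamicParams({ orbitalSpeedKms, distanceAu, maxDistanceAu }) {
  const speed = norm(orbitalSpeedKms, 0, 50);
  const dist = norm(distanceAu, 0, maxDistanceAu);
  return {
    cutoffHz: lerp(200, 8000, speed), // faster orbit -> brighter timbre
    gain: 1 - dist,                   // closer to Earth -> louder
    pan: lerp(-1, 1, dist),           // assumption: pan follows distance
  };
}
```

For a planet at half its maximum distance moving at 25 km/s, this yields a 4100 Hz cutoff, a gain of 0.5 and a centered pan.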

About the scale: one full revolution of the solar system (Neptune around the Sun) takes approximately 165 years in real time, which corresponds to 6 minutes in this representation.
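Worked out from the document's own numbers, that time scale gives roughly two seconds of playback per real year:

```javascript
// time compression stated above: ~165 real years -> 6 minutes of audio
const YEARS_PER_REVOLUTION = 165;  // one Neptune orbit
const PLAYBACK_SECONDS = 6 * 60;

// seconds of playback per real year
const secondsPerYear = PLAYBACK_SECONDS / YEARS_PER_REVOLUTION; // ≈ 2.18 s

// overall speed-up factor (real seconds per playback second)
const speedup = (YEARS_PER_REVOLUTION * 365.25 * 86400) / PLAYBACK_SECONDS;
// ≈ 14.5 million times faster than real time
```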

4) Current Version & Future developments

The current version runs on HTML5, CSS3 & JavaScript. It is available on GitHub for testing and running.

Future developments include:

- Integration of additional editing layers:

- User auto-generated data (is your IP location in day or night, at a hot or cold temperature, in the southern or northern hemisphere...). It could be used to modify either sound parameters or visuals (the background color, for example).

- Using start and end dates for the process, inputting data related to these dates and producing a fast-forward version of months or years, for example. Also, the possibility of activating a "Live Data Feed" mode: when on, the start and end dates are deactivated and only the live feed prevails.

- Being able to edit musical parameters in real time, mainly effects, to make the user interface more "playful".

- Adding the possibility to swap the synthesiser for one or more other instruments, such as flutes or strings.

- End-to-end integration into the orchestra of our Python code, to provide near-real-time Earth-planet distance data.

- In-depth modulation research to modify the tonality of the synthesisers so that they could (with a trigger switch, for example) have a more "real/identifiable" sound. For example, Mars would sound "warm/gassy", whereas Uranus would sound "watery/liquid".

- Integration of additional numerical data visualisation boxes, for example to visualise the live feed of the distance to a given planet.

5) Inc.K Team

Incorporating knowledge from art history, graphic design, astronomy, media engineering, chemistry, data science, and user experience to offer you an innovative poetico-scientific visualisation: the Solar System Orchestra.

Stefania Bertella:

Philippe Cuendet:

Admir Demir:

Yann Heurtaux:

Patrick Matter:

Raphael Schmutz:

Anil Tuncel:

David Viale:

6) Sources:




SpaceApps is a NASA incubator innovation program.