
Our Method
Unveiling the Sound of Science.

 

Welcome to the core of Project Harmonics, where we demystify the intricate world of science through the enchanting medium of sound. Our methodology is a symphony of science and music, meticulously designed to make complex concepts accessible and captivating.

Harmonics_img_9.png

The Art of Data Sonification


At the core of our methodology lies the art of data sonification. We take raw data that would otherwise be presented visually and translate it into melodies that can be both heard and felt. This process maps data attributes to sound parameters, creating a sensory experience that brings science to life.

When Harmonics was first conceptualized for a CERN outreach project, it was designed as a particle physics data interpreter for the visually impaired, converting datasets into audio cues. Below, we walk through the methods used for the Harmonics outreach program.

Harmonics_img_10.png

Our process begins with a dataset in CSV format. We analyze the data to identify variables suitable for sonification. In our demo, the particle collision dataset includes variables such as ID (particle type), P (momentum), theta (deflection angle), beta (velocity as a fraction of the speed of light), nphe (number of photoelectrons), Ein (inner energy), and Eout (outer energy).


Our analysis revealed that ID, P, and theta were suitable for sonification because they could be mapped to MIDI attributes such as note, velocity, and pitch bend. While other variables could have been mapped, they were less ideal due to rapid fluctuations or near-constant values, which could confuse the listener. We aim to strike a balance between scientific accuracy and musical clarity, ensuring an engaging and informative auditory experience.


The data is then cleaned: variables not used for sonification are removed, and for this demo we limited the dataset to a thousand rows.
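As a rough sketch of this cleaning step (the column names follow the dataset description above, but the parsing code itself is an illustration, not our exact script):

```python
import csv
import io

def clean_dataset(csv_text, keep=("id", "p", "theta"), max_rows=1000):
    """Keep only the columns used for sonification and cap the row count."""
    reader = csv.DictReader(io.StringIO(csv_text))
    cleaned = []
    for i, row in enumerate(reader):
        if i >= max_rows:
            break
        cleaned.append({col: float(row[col]) for col in keep})
    return cleaned

# Tiny illustrative input (not real collision data)
sample = "id,p,theta,beta,nphe\n211,0.8,0.41,0.95,3\n2212,1.2,0.35,0.62,0\n"
rows = clean_dataset(sample)  # two rows, each reduced to id, p, theta
```

In a real run the CSV text would be read from the dataset file; the cap of 1,000 rows keeps the resulting track to a listenable length.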

Harmonics_Process_5.png
Harmonics_img_10.png

Python: Our core language for processing


To orchestrate this transformation, we employ Python, a versatile and powerful programming language. Python allows us to manipulate and process data with precision, providing the foundation for our sonification process. By diving into the code, you'll gain insight into how Python is harnessed to craft these unique musical representations of scientific data.

harmonics_img_8.png


Our Python code takes the dataset in CSV format, containing critical particle attributes including particle type, momentum, and deflection angle, and transforms it through a systematic process.


In the first phase, the Python code maps particle IDs to MIDI notes, associating each particle type with a specific musical note. It also converts momentum into MIDI velocity, determining the speed at which the notes are played. Additionally, the code calculates pitch bend values based on the deflection angle, enabling dynamic pitch changes in the sound representation. 
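This first mapping phase can be sketched as follows. The specific particle-ID codes (PDG-style placeholders), the MIDI note numbers (we assume the C3 = note 48 convention), and the momentum and theta ranges are illustrative assumptions, not necessarily the exact values in our script:

```python
# Illustrative particle-ID -> MIDI note table (assumes C3 = MIDI note 48;
# the ID codes are PDG-style placeholders, not necessarily our dataset's).
NOTE_FOR_ID = {
    -11: 55,   # positron -> G3 (highest)
    321: 52,   # kaon     -> E3
    211: 50,   # pion     -> D3
    2212: 48,  # proton   -> C3 (lowest)
}

def momentum_to_velocity(p, p_min=0.0, p_max=5.0):
    """Linearly scale momentum (assumed 0-5 GeV/c range) into MIDI velocity 1-127."""
    frac = min(max((p - p_min) / (p_max - p_min), 0.0), 1.0)
    return 1 + round(frac * 126)

def theta_to_pitch_bend(theta, theta_min=0.0, theta_max=1.0):
    """Map the deflection angle (assumed range, radians) onto the
    14-bit MIDI pitch-bend range -8192..8191."""
    frac = min(max((theta - theta_min) / (theta_max - theta_min), 0.0), 1.0)
    return round(-8192 + frac * 16383)
```

Clamping keeps outliers within legal MIDI ranges; a real script might instead normalize against the dataset's observed minimum and maximum.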


The Python code then arranges the musical events within the MIDI file. It adds note-on events, pitch bend events, and note-off events, meticulously timed to represent the data with precision. To ensure smooth transitions between events, a grace period is incorporated, enhancing the clarity and cohesiveness of the composition.
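The event layout can be sketched like this, using a 0.7-second note length and a 0.3-second grace period as example figures. Converting these events to ticks and writing the actual .mid file (for instance with a library such as mido) is omitted, and the row format is an assumption carried over from the mapping sketch:

```python
def schedule_events(mapped_rows, note_len=0.7, grace=0.3):
    """Place note-on, pitch-bend, and note-off events on an absolute
    timeline (seconds). A grace period separates consecutive notes so
    adjacent events never overlap. Each row is assumed to carry the
    'note', 'velocity', and 'bend' values produced by the mapping phase.
    """
    events, t = [], 0.0
    for row in mapped_rows:
        events.append((t, "note_on", row["note"], row["velocity"]))
        events.append((t, "pitch_bend", row["bend"]))             # bend applied at note onset
        events.append((t + note_len, "note_off", row["note"]))
        t += note_len + grace                                     # grace period before next note
    return events

demo = [{"note": 55, "velocity": 100, "bend": 0},
        {"note": 48, "velocity": 64, "bend": 4000}]
events = schedule_events(demo)
```

A MIDI writer would then convert each absolute time to a delta time in ticks before saving the file for the DAW.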


The final step in the Python code's journey is preserving the auditory output: the generated MIDI file is saved for further processing in our DAW (digital audio workstation).

Harmonics_process_1.png
harmonics_process_2.png
Harmonics_process_3.png
Harmonics_process_4.png

Soundtrap: Crafting the Soundscape


Our journey continues as we transition to Soundtrap, an online digital audio workstation (DAW). Here, the scientific data, now encoded as MIDI files, is further shaped into captivating compositions. Soundtrap provides a space where music elements like melody, harmony, and rhythm come together to craft an auditory experience mirroring the complexities of the dataset.

harmonics_img_3.png
Harmonics_process_6.png

In Soundtrap, we breathe life into our raw MIDI data by adding instruments and effects. For this demo, we chose synths, pianos, and pads to create an ethereal atmosphere.

harmonics_img_3.png

Next, we refine the rawness of the imported MIDI file using effects. These include an electric guitar amp, tremolo, and a two-band EQ for shaping the sound. To enhance the ethereal atmosphere, we introduce reverb, stereo delay, and a compressor.

Harmonics_img_9.png

Lastly, we layer the tracks. We've organized three for this demo. The main track features the MIDI file created in this demo, with a 0.7-second note-on time and a 0.3-second grace period. The primary background track uses the same MIDI file but replaces the piano with a pad to add a spacious ambiance. The secondary background track also uses the same MIDI data, but with a 0.1-second note-on time and no grace period, resulting in a rapid melody that is smoothed out with a synth and effects.

The result


The music in this track has a unique mood: it feels dreamy and open. The fun part is that it is all synthesized from a particle dataset. We're trying to make particle physics more interesting and understandable for everyone, in a way that connects with how people feel and experience things.


You can play the track below and learn what it is trying to convey.

Project Harmonics - 01

The piano's volume reflects the particle's momentum. Louder sounds represent higher momentum, while softer sounds indicate lower momentum.


To identify the detected particle type, we use four distinct notes. The highest note corresponds to a positron (G3), followed by a kaon (E3), then a pion (D3), and finally, a proton (C3).


Listen for pitch changes while a note is playing: a rising or falling pitch bend means the theta value, the particle's deflection angle, is increasing or decreasing.

Resources for further exploration


Complete Code for Data Sonification:
Access the full Python code used for data sonification in our project. You can explore the code to understand how particle attributes are transformed into captivating melodies. Link to Python Code


Soundtrap Composition:
Experience the final auditory composition created using Soundtrap. Listen to how scientific data is transformed into music, bridging the gap between sight and sound in the world of particles. Link to Soundtrap Composition


Particle Dataset (CSV File):
Download the raw particle dataset in CSV format. This dataset served as the foundation for our sonification process. Link to Particle Dataset CSV (cleaned)

Link to Particle Dataset CSV (Original)


© 2023. All rights reserved. Soundtrap is a trademark of Spotify AB.
