Wednesday, April 13, 2016

Listen to the data with Visual Process Analytics


Visual analytics provides a powerful way for people to discover patterns and trends in data by rendering them visually. In real life, however, we use both our eyes and ears. So could we also hear patterns and trends if we listened to the data?

I spent a few days studying the JavaScript Sound API and adding simple data sonification to our Visual Process Analytics (VPA) to explore this question. I don't know where bringing the auditory sense into the analytics toolkit may lead us, but you never know. It is always good to experiment with various ideas.


Note that the data sonification capabilities of VPA are very experimental at this point. To make matters worse, I am not a musician by any stretch of the imagination, so the sounds generated by the latest version of VPA may sound horrible to you. But this represents a step toward better interaction with complex learner data. As my knowledge of music improves, the data should sound less terrifying.

The first test feature added to VPA is very simple: it just converts a time series into a sequence of notes and rests. To adjust the sound, you can change a number of parameters such as pitch, duration, attack, decay, and oscillator type (sine, square, triangle, sawtooth, etc.). All these options are available through the context menu of a time series graph.
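
Under the hood, a mapping like this takes only a few lines of JavaScript with the Web Audio API. Here is a minimal sketch, assuming a linear value-to-frequency mapping and treating missing values as rests; the parameter names and ranges are my own illustrations, not VPA's actual code:

  // A minimal sketch of time-series sonification with the Web Audio API.
  // The frequency range, envelope values, and rest convention are
  // illustrative assumptions, not VPA's actual implementation.
  const context = new AudioContext();

  function playSeries(values, { noteDuration = 0.25, attack = 0.02,
                                decay = 0.2, oscillatorType = 'sine' } = {}) {
    const numbers = values.filter(v => v != null);
    const min = Math.min(...numbers);
    const max = Math.max(...numbers);
    values.forEach((value, i) => {
      if (value == null) return; // a missing value becomes a rest
      const startTime = context.currentTime + i * noteDuration;
      // Map the data value linearly onto a frequency range (220-880 Hz).
      const frequency = 220 + (660 * (value - min)) / (max - min || 1);

      const oscillator = context.createOscillator();
      oscillator.type = oscillatorType; // sine, square, triangle, sawtooth
      oscillator.frequency.setValueAtTime(frequency, startTime);

      // Shape each note with a simple attack/decay envelope.
      const gain = context.createGain();
      gain.gain.setValueAtTime(0, startTime);
      gain.gain.linearRampToValueAtTime(0.5, startTime + attack);
      gain.gain.linearRampToValueAtTime(0, startTime + attack + decay);

      oscillator.connect(gain);
      gain.connect(context.destination);
      oscillator.start(startTime);
      oscillator.stop(startTime + noteDuration);
    });
  }

  // Example: sonify a short time series with one rest in it.
  playSeries([3, 5, null, 8, 6, 2, 7, 9, 4], { oscillatorType: 'triangle' });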

While the sound plays, you can also watch a synchronized animation in VPA (as demonstrated by the embedded videos). This means that, from now on, VPA is a multimodal analytics tool. But I have no plan to rename it, as data visualization is still, and will remain, the dominant mode of the data mining platform.

The next step is to figure out how to synthesize better sounds from multiple types of actions, treating them as multiple sources or instruments (much like the Song from Pi). I will start by sonifying the scatter plot in VPA. Stay tuned.
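
One way this could be prototyped is to give each action type its own oscillator timbre, so that different kinds of learner actions are heard as different instruments. The sketch below is purely speculative; the action type names and the mappings are hypothetical, not VPA's design:

  // A speculative sketch: each action type gets its own "instrument"
  // (oscillator timbre). Action types and values are hypothetical.
  const actionContext = new AudioContext();

  const timbres = { // hypothetical action types -> timbres
    'graph.change': 'sine',
    'model.run': 'square',
    'note.edit': 'sawtooth'
  };

  function playActions(actions) {
    actions.forEach(({ type, time, value }) => {
      const startTime = actionContext.currentTime + time;
      const oscillator = actionContext.createOscillator();
      oscillator.type = timbres[type] || 'triangle';
      oscillator.frequency.setValueAtTime(220 + 40 * value, startTime);

      const gain = actionContext.createGain();
      gain.gain.setValueAtTime(0.4, startTime);
      gain.gain.linearRampToValueAtTime(0, startTime + 0.2);

      oscillator.connect(gain);
      gain.connect(actionContext.destination);
      oscillator.start(startTime);
      oscillator.stop(startTime + 0.2);
    });
  }

  // Example: two different action types, heard as two instruments.
  playActions([
    { type: 'graph.change', time: 0, value: 3 },
    { type: 'model.run', time: 0.5, value: 5 }
  ]);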

4 comments:

Unknown said...

Charles, this is a great project. The work you are doing to visualize sound through Visual Process Analytics and sonification is really beautiful.

Some of the references I used in my TED-Ed lesson might be of interest: http://ed.ted.com/lessons/music-and-math-the-genius-of-beethoven-natalya-st-clair#digdeeper. I also like this talk about music, math, and symmetry: https://www.youtube.com/watch?v=UcIxwrZV10A&feature=youtu.be. Finally, check out some math and music visualizations: https://vimeo.com/69715960 & https://youtu.be/ipzR9bhei_o?list=PL94A6AD2D44DF34B5 & https://www.youtube.com/watch?v=APQ2RKECMW8.

Have fun with the scatterplots you create, and again, "sounds" like the beginnings of a great project!

Charles Xie said...

Thanks, Natalya. Your videos are exactly what I need to learn the theory of music.

Unknown said...

You got me thinking about all the attributes of a musical note that might be tied to data.
* Pitch, as used in your prototype
* Duration
* Volume
* Timbre, roughly determined by the relative amplitudes of the harmonics of the fundamental frequency
* Envelope, the variation of pitch, volume, and timbre over the duration of the note

But a sound need not be musical to be useful in sonification. Different event types in the data could produce clicks, taps, slaps, etc.
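
For example, a short burst of white noise with a fast decay makes a serviceable click in the Web Audio API; a rough sketch, with all parameter values arbitrary:

  // A rough sketch of a non-musical "click" for discrete events:
  // a short burst of white noise with a fast-decaying envelope.
  const clickContext = new AudioContext();

  function playClick(when, duration = 0.03) {
    const frames = Math.floor(clickContext.sampleRate * duration);
    const buffer = clickContext.createBuffer(1, frames, clickContext.sampleRate);
    const data = buffer.getChannelData(0);
    for (let i = 0; i < frames; i++) {
      data[i] = (Math.random() * 2 - 1) * (1 - i / frames); // decaying noise
    }
    const source = clickContext.createBufferSource();
    source.buffer = buffer;
    source.connect(clickContext.destination);
    source.start(clickContext.currentTime + when);
  }

  // e.g. one click per event, spaced by the event timestamps
  [0, 0.4, 0.55, 1.2].forEach(t => playClick(t));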

Charles Xie said...

Duration and envelope were implemented in the example and can be controlled by the data miner. I think a sound should be pleasant to listen to; otherwise it will drive the miner crazy, as he has to listen to so many of these sounds.