Entries in data (5)

Monday, December 8, 2014

Sonify… Wikipedia

Sonification, and especially data sonification, is still an underused technique. I’ve been quite interested in sonification and have heard both very useful and utterly rubbish applications. I’ve been trying to wrap my head around which sonifications work and which don’t.

In the “Sonify…” posts, I will post about different ways of sonifying data. This time: sonifying Wikipedia. Listen to Wikipedia by Hatnote is a sonification and visualisation of changes being made to Wikipedia. Hatnote is Mahmoud Hashemi and Stephen LaPorte, both interested in “Wiki life”.

“Listen to Wikipedia” sonifies changes to Wikipedia articles in real time. Bell sounds indicate additions, and string plucks indicate subtractions to an article. Pitch changes according to the size of the edit. It’s worth noting that Wikipedia is maintained by both bots and humans, and it’s only through these web experiments that we can see or hear that labour force.
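Out of curiosity, here’s roughly what that mapping could look like in code. This is a minimal sketch, not Hatnote’s actual implementation: the event format, the pitch range, and the assumption that larger edits map to lower pitches are all mine.

import math

def sonify_edit(change_size):
    """Map one Wikipedia edit (signed byte delta) to a note event."""
    # Additions become bells, removals become string plucks.
    instrument = "bell" if change_size >= 0 else "string_pluck"
    # Assumption: larger edits sound deeper. Compress the byte count
    # logarithmically onto a MIDI-style pitch range (48..84).
    magnitude = min(math.log1p(abs(change_size)) / math.log1p(10_000), 1.0)
    pitch = round(84 - 36 * magnitude)
    return {"instrument": instrument, "pitch": pitch}

print(sonify_edit(12))      # small addition  -> high bell
print(sonify_edit(-4000))   # large removal   -> low string pluck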

What do you think? Is this a good sonification of the data of Wikipedia?

Wednesday, October 8, 2014

Ryoji Ikeda's Superposition

We’ve seen the works of Paris-based artist Ryoji Ikeda before. They are often raw, glitchy works exploring data sonification and, more recently, its combination with visuals.

Ryoji’s latest work, or rather the latest update of his work superposition, is described as follows:

A multimedia music, visual, and theater work at the intersection of art and science, superposition, inspired by the subatomic world, mines the notion that it is not possible to fully describe the behavior of a single particle except in terms of probabilities. The work is an immersive experience, an orchestrated journey through sound, language, physical phenomena, mathematical concepts, human behavior, and randomness, all simultaneously arranged and rearranged in a theatrical arc that obliterates the boundaries between music, visual arts, and performance.

To achieve this, Ryoji has two performers generate the materials (videos, point clouds, text, sounds) live and superimposes these over 21 screens. The work premieres in the US this month, on October 17th and 18th, at the Metropolitan Museum of Art in New York.

Friday, March 14, 2014

Visualising Porto's soundscape

Back in January we saw the Stereopublic project, which crowdsourced the quiet and used it as inspiration for short musical pieces as well. The URB project in Porto, Portugal takes a more academic approach, very carefully measuring and analysing the urban soundscape.

URB is a soundscape storage and analysis system conceived by José Alberto Gomes and developed in partnership with Diogo Tudela. URB’s goal is to keep a record of the sonic profile of Porto, allowing researchers and artists to use the dataset freely within their own projects.

Using four Raspberry Pis spread throughout the city, each equipped with a soundcard and an electret mic, URB constantly listens to the environment and stores sonic features in an online public database. See the map above for where they placed the listening spots.

The datasets are freely available online, but you can also navigate them using URB XY, a data visualisation tool by Diogo Tudela. It’s interesting to see the differences between day and night, for example. Analysed properties like amplitude, zero-crossings, irregularity, and spectral centroid are all easy to view, and the tool is great for getting a grip on the urban soundscape.
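The listed properties are standard signal descriptors, so it’s easy to sketch what each listening spot might compute per frame. Below is a minimal sketch assuming a mono frame, a 44.1 kHz sample rate, and plain numpy; this is not URB’s actual code.

import numpy as np

SAMPLE_RATE = 44_100  # Hz, assumed

def frame_features(frame):
    """Compute a few standard descriptors for one mono audio frame."""
    rms = float(np.sqrt(np.mean(frame ** 2)))                  # amplitude
    zcr = float(np.mean(np.abs(np.diff(np.sign(frame)))) / 2)  # zero-crossings per sample
    windowed = frame * np.hanning(len(frame))                  # reduce spectral leakage
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1 / SAMPLE_RATE)
    centroid = float((freqs * spectrum).sum() / (spectrum.sum() + 1e-12))
    return {"rms": rms, "zcr": zcr, "centroid_hz": centroid}

# A 1 kHz test tone: the spectral centroid should land near 1000 Hz.
t = np.arange(2048) / SAMPLE_RATE
print(frame_features(np.sin(2 * np.pi * 1000 * t)))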

What I really like about this is that the data can be used for multiple purposes, from giving the government insight into noise pollution for use in city planning, to artistic purposes. Which is exactly what the We The Citizens project (above) is about: using the data from the URB system in artistic ways to make the audience aware of the sound ecology of the city. I hope this’ll happen in more cities, as most citizens are still unaware of the effects of noise pollution and sound ecology.

Tuesday, March 2, 2010

Music visualization: Narratives 2.0

In his keynote for the Sonic Acts festival last week, Dirk de Kerkhoven spoke about data visualization and briefly showed Narratives 2.0 by Matthias Dittrich. The program visualizes music by segmenting it into different channels and showing them in a fan-like manner. The angle of each line is determined by the frequency of its channel, while high levels are colored orange.

In the image above we see the result for Beethoven’s Fifth Symphony. Quite beautiful. It makes me want to listen to the piece and see if I can follow the lines and intensities. It is nice to look at a piece of music from a different perspective. The purpose of the project is not to create an exact mapping of the frequencies, but to have an aesthetic, artistic representation of it.
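For the curious, here is a speculative sketch of that kind of channel-to-fan mapping. It is not Dittrich’s code: the band splitting, the fan angles, and the loudness threshold for the orange color are all my assumptions.

import numpy as np

def fan_layout(spectrum_db, n_channels=12, fan_degrees=120.0, loud_db=-20.0):
    """Assign each frequency channel a fan angle and a color for one frame."""
    bands = np.array_split(spectrum_db, n_channels)     # low to high frequency
    levels = np.array([band.mean() for band in bands])
    angles = np.linspace(0.0, fan_degrees, n_channels)  # angle follows frequency
    colors = np.where(levels > loud_db, "orange", "grey")
    return list(zip(angles, levels, colors))

# Fake dB spectrum with one loud mid band, which should come out orange.
rng = np.random.default_rng(0)
demo = -60.0 + 20.0 * rng.random(1024)
demo[172:344] += 45.0
for angle, level, color in fan_layout(demo):
    print(f"{angle:6.1f} deg  {level:6.1f} dB  {color}")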

Wednesday, September 16, 2009

Reflection, a data sculpture by Benjamin Maus

Reflection, a data sculpture by Benjamin Maus, was inspired by a musical piece by Frans de Waard. Software was used to analyze the frequencies of the music. Unlike some projects we looked at before (Cylinder and this project), it’s not the visualization of a single sound but of a complete piece of music.
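As a thought experiment, here is one plausible way such an analysis could produce fabrication data: coarsen the spectrogram of the whole piece into a relief height field. This is my own sketch under my own assumptions (frame sizes, grid resolution), not Maus’s actual pipeline.

import numpy as np

def spectrogram_relief(samples, frame=4096, hop=2048, rows=64, cols=200):
    """Reduce a whole track to a rows x cols height field (values 0..1)."""
    window = np.hanning(frame)
    frames = [samples[i:i + frame] * window
              for i in range(0, len(samples) - frame, hop)]
    spec = np.log1p(np.abs(np.fft.rfft(np.array(frames), axis=1)))
    # Coarsen the time and frequency axes to a fabricatable grid.
    t = np.linspace(0, spec.shape[0] - 1, cols).astype(int)
    f = np.linspace(0, spec.shape[1] - 1, rows).astype(int)
    relief = spec[np.ix_(t, f)].T            # rows = frequency, cols = time
    return relief / (relief.max() + 1e-12)   # normalized heights

# Ten seconds of noise stands in for a real recording here.
sr = 44_100
heights = spectrogram_relief(np.random.default_rng(1).standard_normal(10 * sr))
print(heights.shape)  # (64, 200)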

Why do we want to see everything? Some things appear to be more real if you can see them, or touch them. We will not be able to tell what the music sounded like by looking at a sculpture like this. It does look quite fascinating though.