Oompah.py – Big Brass meets Big Data (leedsdatamill.org)
28 points by ntoll on March 25, 2014 | 5 comments


This post uses the phrase "auralizing data", but I've also read "sonifying data" elsewhere, i.e. "auralization" vs. "sonification"; see e.g.

https://en.wikipedia.org/wiki/Sonification

It's the sound analogue of visualization.

I've seen the general idea being called perceptualization. One could also map data to smells, or felt temperatures, or smooth/rough textures, or…
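
For the sound channel specifically, here's a minimal sketch of the idea in Python (my own illustration with made-up parameters, not the article's Oompah.py code): scale each data value onto a note range, then convert the notes to frequencies.

    # Minimal data-to-pitch sonification sketch (illustrative only, not Oompah.py).
    # Each data value is scaled linearly onto a MIDI note range, then converted
    # to a frequency in Hz with the equal-temperament formula.

    def sonify(values, low_note=48, high_note=84):  # C3..C6, hypothetical range
        lo, hi = min(values), max(values)
        span = (hi - lo) or 1  # avoid division by zero on constant data
        notes = [round(low_note + (v - lo) / span * (high_note - low_note))
                 for v in values]
        return [440.0 * 2 ** ((n - 69) / 12) for n in notes]  # MIDI note -> Hz

    print(sonify([3, 14, 15, 9, 26, 5]))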


Does the outcome sound good enough to be performed by an orchestra? A link to the audio would be appreciated.


There's no audio; per the article, no one has played it yet. Just reading the sheet music, though, I don't know whether it's actually playable or whether it's all within the human range of hearing.


I'm pretty sure that many/most of the notes will be above 18 kHz, since you'd need notations on the order of 56va to get them back onto the treble clef (where "8va" means play one octave higher than notated). Before my Dad lost my flute, I used to be pretty good at reading notation in the two octaves above middle C, but this hurts my eyes. Maybe you could revoice all the chords and we could figure out the harmonic structure.
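
To put rough numbers on that (my own arithmetic, not from the article): each octave doubles the frequency, so a pitch only a handful of octaves above the treble clef already clears 18 kHz.

    # Rough audibility check (my own arithmetic, not from the article).
    # Each octave doubles the frequency; ~18 kHz is near the top of most
    # adults' hearing, so anything much higher is effectively silent.

    A4 = 440.0  # concert A, Hz
    for octaves_up in range(4, 9):
        freq = A4 * 2 ** octaves_up
        label = "audible" if freq <= 18_000 else "above ~18 kHz"
        print(f"A4 + {octaves_up} octaves: {freq:8.0f} Hz ({label})")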


Correct - as the article demonstrates, we achieved rather buggy results in the short time available to us. We're continuing to work on the project and plan to get the piece performed for LeedsDataMill by a real brass band. I'm guessing the performance will be blogged, and I'll cross-post it to HN when it's published.



