IEEE Multimedia's latest issue is about interactive sonification (and interaction with sonification). The point is that sonification "presents information by using sound (particularly non-speech), so that the user of an auditory display obtains a deeper understanding of the data or processes under investigation by listening". Interactive sonification is then defined as "the use of sound within a tightly closed human–computer interface where the auditory signal provides information about data under analysis, or about the interaction itself, which is useful for refining the activity". This issue provides the reader with a good review of cutting-edge projects:
- Zhao et al. report on “Interactive Sonification of Choropleth Maps.” This auditory extension of visual maps is not only of interest to blind users; it also inspires us to consider extending other visual techniques into the auditory domain.
- Fernström, Brazil, and Bannon present in their article, “HCI Design and Interactive Sonification for Fingers and Ears,” an investigation of an audio-haptic interface for ubiquitous computing. It highlights how people can exploit the synergies between data presented in different modalities (touch, sound, and vision).
- In their article, “Sonification of User Feedback through Granular Synthesis,” Williamson and Murray-Smith report on progress with high-dimensional data distributions, one of the most appropriate applications of sonification (see the sketch after this list).
- From a completely different angle, Effenberg discusses in his article, “Movement Sonification: Effects on Perception and Action,” how an auditory display can enhance motor perception in sports. Effects on perception and action are reported from a psychophysical study.
- In “Continuous Sonic Feedback from a Rolling Ball,” Rath and Rocchesso demonstrate the use of an interface bar called the Ballancer. Although this interface is not yet used to explore independent data, it is an ideal platform for studying the tight coupling at the heart of an auditory interaction loop.
- Hinterberger and Baier present the Poser system in “Parametric Orchestral Sonification of EEG in Real Time.” The electroencephalogram is an interesting type of signal for sonification because it involves temporal, spectral, and spatial organization of the data.
- Finally, in “Navigation with Auditory Cues in a Virtual Environment,” Lokki and Gröhn show how sonification can enhance navigation and operation in spaces that so far have only been explored visually.
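To make the grain-based mapping idea a bit more concrete, here is a minimal parameter-mapping granular-synthesis sketch in Python. It is not Williamson and Murray-Smith's implementation: the toy Gaussian data, the choice to drive each grain's pitch, duration, and amplitude from the first three dimensions of a sample, and the numeric ranges are all my own assumptions, meant only to show how a cloud of data points can be rendered as a cloud of sound grains.

```python
# A minimal parameter-mapping granular-synthesis sketch (not the article's implementation):
# each sample drawn from a toy high-dimensional distribution becomes a short sinusoidal
# grain whose pitch, duration, and amplitude encode three of its dimensions.
import numpy as np
import wave

SR = 44100  # sample rate in Hz

def grain(freq_hz, dur_s, amp):
    """Render one Hann-windowed sine grain."""
    n = int(SR * dur_s)
    t = np.arange(n) / SR
    return amp * np.hanning(n) * np.sin(2 * np.pi * freq_hz * t)

def sonify(samples, total_s=4.0):
    """Scatter one grain per data sample across a fixed-length buffer.

    `samples` is an (N, D) array with D >= 3; dimensions 0-2 are mapped to
    pitch (200-1200 Hz), duration (20-120 ms), and amplitude (0.1-0.5).
    """
    out = np.zeros(int(SR * total_s))
    lo, hi = samples.min(axis=0), samples.max(axis=0)
    norm = (samples - lo) / np.where(hi - lo == 0, 1, hi - lo)
    for row in norm:
        g = grain(200 + 1000 * row[0], 0.02 + 0.10 * row[1], 0.1 + 0.4 * row[2])
        start = np.random.randint(0, len(out) - len(g))
        out[start:start + len(g)] += g
    return out / max(1.0, np.abs(out).max())  # avoid clipping

if __name__ == "__main__":
    data = np.random.randn(300, 5)            # toy high-dimensional samples
    audio = sonify(data)
    with wave.open("grains.wav", "wb") as f:  # 16-bit mono WAV
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SR)
        f.writeframes((audio * 32767).astype(np.int16).tobytes())
```

Running the script writes grains.wav; because the mapping is monotone, denser regions of the data distribution come out as denser clusters of similar pitches and grain lengths, which is the kind of cue such a display lets the ear pick up.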
My favorite is Fernström, Brazil, and Bannon's project.