Sonic interaction design

Sonic interaction design is the study and exploitation of sound as one of the principal channels conveying information, meaning, and aesthetic/emotional qualities in interactive contexts.[1][2] It lies at the intersection of interaction design and sound and music computing. Where interaction design is concerned with designing objects that people interact with, and such interactions are facilitated by computational means, in sonic interaction design sound mediates the interaction, either as a display of processes or as an input medium.

Research areas

Perceptual, cognitive, and emotional study of sonic interactions

Research in this area focuses on experimental scientific findings about human sound reception in interactive contexts.[3]

During closed-loop interactions, users manipulate an interface that produces sound, and the sonic feedback in turn affects their manipulation; there is thus a tight coupling between auditory perception and action.[4] Listening to a sound might not only activate a representation of how the sound was made: it might also prepare the listener to react to it.[5] Cognitive representations of sounds might be associated with action-planning schemas, and sounds can also unconsciously cue a further reaction on the part of the listener.

Sonic interactions have the potential to influence users’ emotions: the quality of the sounds affects the pleasantness of the interaction, and the difficulty of the manipulation influences whether the user feels in control.[6]

Product sound design

Product design in the context of Sonic Interaction Design deals with methods and experiences for designing interactive products with a salient sonic behaviour. Products, in this context, are either tangible, functional objects that are designed to be manipulated,[7][8] or usable simulations of such objects, as in virtual prototyping. Research and development in this area draws on studies from other disciplines, such as product sound quality,[9] film sound design,[10] and game sound design.[11]

In design research for sonic products, a set of practices has been inherited from a variety of fields and tested in contexts where research and pedagogy naturally intermix. Among these practices are vocal sketching,[12] theatrical strategies,[13] the sketching of continuous sonic interactions,[14] and video brainstorming and prototyping.[15]

Interactive art and music

In the context of Sonic Interaction Design, interactive art and music projects design and research aesthetic experiences in which sonic interaction is the focus. The creative and expressive aspects – the aesthetics – matter more than conveying information through sound. Practices include installations, performances, public art, and interactions between humans through digitally augmented objects and environments. These often integrate elements such as embedded technology, gesture-sensitive devices, loudspeakers, or context-aware systems.

The experience itself is the focus: how humans are affected by the sound, and vice versa. Interactive art and music allow us to question existing paradigms and models of how we interact with technology and sound, going beyond paradigms of control (a human controlling a machine). Users become part of a loop of action and perception.

Interactive art and music projects invite explorative actions and playful engagement. There is also a multi-sensory aspect: haptic-audio[18] and audio-visual projects are especially popular. Among many other influences, this field is informed by the merging of the roles of instrument maker, composer, and performer.[19]

Artistic research in Sonic Interaction Design concerns productions in the interactive and performing arts that exploit enactive engagement with sound-augmented interactive objects.[20]

Sonification

Sonification is the data-dependent generation of sound, provided that the transformation is systematic, objective, and reproducible, so that it can be used as a scientific method.[21]

For Sonic Interaction Design, sonification provides a set of methods to create interaction sounds that encode relevant data, so that the user can perceive or interpret the conveyed information. Sonification need not represent huge amounts of data in sound; it may convey only one or a few data values. For example, imagine a light switch that, on activation, produces a short sound that depends on the electric power drawn through the cable: more energy-wasting lamps would systematically result in more annoying switch sounds. This example shows that sonification aims to provide information through the systematic transformation of data into sound.
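
As a minimal sketch of such a data-to-sound mapping (in Python, using only the standard library; the wattage range, pitch range, noise mix, and file names are illustrative assumptions rather than part of any established design), the switch sound could be rendered offline as follows:

    # Minimal sonification sketch (illustrative assumptions throughout): map an
    # electric-power reading to the pitch and roughness of a short "switch"
    # sound, so that a more wasteful lamp yields a more salient, harsher click.
    import math
    import random
    import struct
    import wave

    SAMPLE_RATE = 44100

    def switch_sound(power_watts, duration=0.15):
        """Return 16-bit mono PCM frames for a short, click-like burst.

        Assumed mapping, chosen for illustration:
        5 W .. 100 W  ->  400 Hz .. 1600 Hz pitch, and 0 .. 0.6 noise mix.
        """
        power = max(5.0, min(100.0, power_watts))
        pitch = 400.0 + (power - 5.0) / 95.0 * 1200.0
        noise_mix = (power - 5.0) / 95.0 * 0.6

        n_samples = int(SAMPLE_RATE * duration)
        frames = bytearray()
        for i in range(n_samples):
            t = i / SAMPLE_RATE
            envelope = math.exp(-t * 30.0)                 # fast decay: click-like
            tone = math.sin(2.0 * math.pi * pitch * t)
            noise = random.uniform(-1.0, 1.0)
            sample = envelope * ((1.0 - noise_mix) * tone + noise_mix * noise)
            frames += struct.pack('<h', int(sample * 32000))
        return bytes(frames)

    def write_wav(path, frames):
        # Write the rendered frames as a mono, 16-bit WAV file.
        with wave.open(path, 'wb') as wav:
            wav.setnchannels(1)
            wav.setsampwidth(2)
            wav.setframerate(SAMPLE_RATE)
            wav.writeframes(frames)

    if __name__ == '__main__':
        write_wav('switch_led.wav', switch_sound(8))            # efficient lamp
        write_wav('switch_incandescent.wav', switch_sound(75))  # wasteful lamp

In an actual product the sound would be synthesized in real time at the moment of switching; the offline rendering above only illustrates how a single measured value can be mapped systematically onto sound parameters.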

The integration of data-driven elements in interaction sound may serve different purposes, such as the closed-loop monitoring of a process during manipulation[22] or the auditory augmentation of everyday objects.[23]

Within the field of sonification, Sonic Interaction Design acknowledges the importance of human interaction for understanding and using auditory feedback.[24] In turn, sonification offers solutions, methods, and techniques that can inspire and guide the design of products and interactive systems.

References

  1. Davide Rocchesso, Stefania Serafin, Frauke Behrendt, Nicola Bernardini, Roberto Bresin, Gerhard Eckel, Karmen Franinović, Thomas Hermann, Sandra Pauletto, Patrick Susini, and Yon Visell, (2008). Sonic interaction design: sound, information and experience. In: CHI '08 Extended Abstracts on Human Factors in Computing Systems (Florence, Italy, April 05 – 10, 2008). CHI '08. ACM, New York, NY, 3969–3972. doi:10.1145/1358628.1358969
  2. Davide Rocchesso and Stefania Serafin, (2009). "Sonic Interaction Design". Editorial of Special Issue. International Journal of Human–Computer Studies 67(11) (Nov. 2009): 905–906. doi:10.1016/j.ijhcs.2009.09.009
  3. Guillaume Lemaitre, Olivier Houix, Yon Visell, Karmen Franinović, Nicolas Misdariis, and Patrick Susini, (2009). "Toward the design and evaluation of continuous sound in tangible interfaces: The Spinotron". International Journal of Human–Computer Studies 67(11) (Nov. 2009): 976–993. doi:10.1016/j.ijhcs.2009.07.002
  4. Salvatore M. Aglioti and Mariella Pazzaglia (2010). "Representing actions through their sound". Experimental Brain Research 206(2): 141–151. doi:10.1007/s00221-010-2344-x
  5. Marzia De Lucia, Christian Camen, Stephanie Clarke, and Micah M. Murray (2009). "The role of actions in auditory object discrimination". Neuroimage 48(2): 475–485. doi:10.1016/j.neuroimage.2009.06.041
  6. Guillaume Lemaitre, Olivier Houix, Karmen Franinović, Yon Visell, and Patrick Susini (2009). "The Flops glass: a device to study the emotional reactions arising from sonic interactions". In Proc. Sound and Music Computing Conference, Porto, Portugal. Available:
  7. Daniel Hug (2008). Genie in a Bottle: Object–Sound Reconfigurations for Interactive Commodities. In: Proceedings of Audiomostly 2008, 3rd Conference on Interaction With Sound (2008). Available: online
  8. Karmen Franinović, Daniel Hug and Yon Visell (June 2007). Sound Embodied: Explorations of Sonic Interaction Design for Everyday Objects in a Workshop Setting. In: Proceedings of the 13th International Conference on Auditory Display, Montréal, Canada, June 26 – 29, 2007, pp. 334–341. Available: online
  9. Richard H. Lyon, (2003). "Product sound quality-from perception to design". Sound and Vibration 37(3): 18–22. Available: online
  10. See filmsound.org: Learning Space dedicated to the Art of Film Sound Design
  11. See About gamesound.org
  12. Inger Ekman and Michal Rinott, (2010). Using vocal sketching for designing sonic interactions, Aarhus, Denmark: Designing Interactive Systems archive, Proceedings of the 8th ACM Conference on Designing Interactive Systems, ISBN 978-1-4503-0103-9. Available: online.
  13. Sandra Pauletto, Daniel Hug, Stephen Barrass and Mary Luckhurst (2009). Integrating Theatrical Strategies into Sonic Interaction Design. In: Proceedings of Audio Mostly 2009 – 4th Conference on Interaction with Sound (2009), Glasgow, 6 p. Available: PDF and online
  14. Davide Rocchesso, Pietro Polotti, and Stefano delle Monache, (28 December 2009). "Designing Continuous Sonic Interaction". International Journal of Design 3(3). Available: online and PDF
  15. Wendy E. Mackay and Anne Laure Fayard, (1999). Video brainstorming and prototyping: techniques for participatory design, Pittsburgh, Pennsylvania: Conference on Human Factors in Computing Systems, CHI '99 extended abstracts on Human factors in computing systems, ISBN 1-58113-158-5. Available: online
  16. Michel Chion, (1994). Audio-Vision: sound on screen. New York: Columbia University Press, ISBN 0-231-07898-6, ISBN 0-231-07899-4. Book review on filmsound.org
  17. Daniel Hug, (2010). "Investigating Narrative and Performative Sound Design Strategies for Interactive Commodities". Lecture Notes in Computer Science, Volume 5954/2010: 12-40, doi:10.1007/978-3-642-12439-6_2. Available: online.
  18. The fifth International Workshop on Haptic and Audio Interaction Design (HAID) on September 16–17, 2010 in Copenhagen, Denmark. http://media.aau.dk/haid10/?page_id=46
  19. International Conference on New Interfaces for Musical Expression http://www.nime.org/
  20. John Thompson, JoAnn Kuchera-Morin, Marcos Novak, Dan Overholt, Lance Putnam, Graham Wakefield, and Wesley Smith, (2009). "The Allobrain: An interactive, stereographic, 3D audio, immersive virtual world". International Journal of Human-Computer Studies 67(11) (Nov. 2009): 934–946. doi:10.1016/j.ijhcs.2009.05.005
  21. Sonification – A Definition http://sonification.de/son/definition
  22. Tobias Grosshauser and Thomas Hermann, (2010). "Multimodal closed-loop Human Machine Interaction" Proc. Interactive Sonification Workshop, Stockholm, http://interactive-sonification.org/ISon2010/proceedings
  23. Till Bovermann, René Tünnermann, and Thomas Hermann, (2010). "Auditory Augmentation – The weather at your fingertips". International Journal of Ambient Computing and Intelligence 2(2): 27–41. doi:10.4018/jaci.2010040102
  24. Thomas Hermann, and Andy Hunt, (2005). "Guest Editors' Introduction: An Introduction to Interactive Sonification". IEEE MultiMedia 12(2): 20-24. doi:10.1109/MMUL.2005.26
