Brown, D. (2016)
Publisher: University of Sussex
Languages: English
Types: Part of book or chapter of book

The “mapping problem” is a long-standing issue in the development of digital musical instruments, and occurs when an instrument’s musical response fails to reflect the performer’s gestural actions. This paper describes ongoing doctoral research that seeks to address this issue by studying the existing gestural languages of Soundpainting and American Sign Language in order to influence the control mappings of free-hand gestural musical instruments. The research seeks to contribute a framework of mapping strategies influenced by these gestural languages, as well as developments in the field of gesture recognition algorithms and novel evaluation methods for free-hand gestural musical instruments.
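The contrast between control-mapping strategies that the abstract refers to can be sketched in a few lines. The snippet below is illustrative only, not the paper's framework: the gesture dimensions, synthesis parameters, and blend weights are hypothetical, chosen to show the difference between a one-to-one mapping (each gesture feature drives exactly one sound parameter) and a many-to-many mapping, which the mapping literature often associates with richer, more instrument-like behaviour.

```python
# Hypothetical sketch: two mapping strategies for a free-hand gestural
# instrument. Gesture features and synthesis parameters are normalised
# to the range 0.0-1.0; all names and weights are made up for illustration.

def one_to_one(gesture):
    """Each gesture dimension drives exactly one synthesis parameter."""
    return {
        "pitch": gesture["hand_height"],
        "amplitude": gesture["hand_openness"],
        "brightness": gesture["hand_speed"],
    }

def many_to_many(gesture):
    """Each synthesis parameter blends several gesture dimensions,
    so no single gesture feature controls a parameter in isolation."""
    return {
        "pitch": 0.8 * gesture["hand_height"] + 0.2 * gesture["hand_speed"],
        "amplitude": 0.5 * gesture["hand_openness"] + 0.5 * gesture["hand_speed"],
        "brightness": 0.6 * gesture["hand_speed"] + 0.4 * gesture["hand_height"],
    }

if __name__ == "__main__":
    gesture = {"hand_height": 0.5, "hand_openness": 1.0, "hand_speed": 0.2}
    print(one_to_one(gesture))
    print(many_to_many(gesture))
```

In the one-to-one case a performer can reason directly about cause and effect, while in the many-to-many case the same hand movement shifts several sound parameters at once; which trade-off best reflects the performer's gestural intent is exactly the question the mapping problem raises.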