Tom Hall (2016)
Publisher: Anglia Ruskin University
Types: Conference object
Recent decades have seen the establishment of computer software live notations intended as music scores, affording new modes of interaction between composers, improvisers, performers and audience. This paper presents a live notations project situated within the research domains of algorithmic music composition, improvisation, performance and software interaction design. The software enables the presentation of live animated scores which display 2D and 3D pitch-space representations of note collections including a spiral helix and pitch-class clock. The software has been specifically engineered within an existing sound synthesis environment, SuperCollider, to produce tight integration between sound synthesis and live notation. In a performance context, the live notation is usually presented as both music score and visualisation to the performers and audience respectively. The case study considers the performances of two of the author's contrasting compositions utilising the software. The results thus far from the project demonstrate the ways in which the software can afford different models of algorithmic and improvised interaction between the composer, performers and the music itself. Also included is a summary of feedback from musicians who have used the software in public music performances over a number of years.
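The abstract mentions 2D and 3D pitch-space representations, including a pitch-class clock and a spiral helix. The paper's software is written in SuperCollider and is not reproduced here, but the underlying geometry is standard: a pitch class maps to an angle on a 12-position circle, and a helix adds a height axis proportional to absolute pitch so that octave equivalents align vertically. A minimal illustrative sketch in Python (function names and the clock orientation are this sketch's assumptions, not the paper's):

```python
import math

def pitch_class_clock(midi_note):
    """Map a MIDI note to 2D coordinates on a 12-position pitch-class clock.

    Convention assumed here: pitch class 0 (C) sits at 12 o'clock and
    classes advance clockwise, as on a conventional clock face.
    """
    pc = midi_note % 12
    angle = math.pi / 2 - 2 * math.pi * pc / 12  # start at top, go clockwise
    return (math.cos(angle), math.sin(angle))

def spiral_helix(midi_note, turns_per_octave=1.0):
    """Map a MIDI note to 3D helix coordinates.

    Angle comes from the pitch class, height from absolute pitch, so
    notes an octave apart share (x, y) and differ only in z.
    """
    x, y = pitch_class_clock(midi_note)
    z = midi_note / 12.0 * turns_per_octave
    return (x, y, z)
```

With these mappings, middle C (MIDI 60) and the C above it (MIDI 72) land on the same clock position, while on the helix they are separated by exactly one turn in height, which is the property that makes the helix useful for visualising note collections across registers.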
