Knight, Dawn
Languages: English
Types: Unknown
Subjects:
Current methodologies in corpus linguistics have revolutionised the way we look at language. They allow us to make objective observations about written and spoken language in use. However, most corpora are limited in scope because they are unable to capture language and communication beyond the word. This is problematic given that interaction is in fact multi-modal, as meaning is constructed through the interplay of text, gesture and prosody; a combination of verbal and non-verbal characteristics. This thesis outlines, then utilises, a multi-modal approach to corpus linguistics, and examines how such an approach can be used to facilitate our explorations of backchanneling phenomena in conversation, such as gestural and verbal signals of active listenership. Although backchannels have been seen as highly conventionalised, they differ considerably in form, function, interlocutor and location (in context and co-text); their relevance at any given time in a given conversation is therefore highly conditional. The thesis provides an in-depth investigation of the use of, and the relationship between, spoken and non-verbal forms of this behaviour, focusing on a particular sub-set of gestural forms: head nods. This investigation is undertaken by analysing the patterned use of specific forms and functions of backchannels within and across sentence boundaries, as evidenced in a five-hour sub-corpus of dyadic multi-modal conversational episodes taken from the Nottingham Multi-Modal Corpus (NMMC). The results of this investigation reveal 22 key findings regarding the collaborative and cooperative nature of backchannels, which both support and extend what is already known about such behaviours. Using these findings, the thesis presents an adapted pragmatic-functional linguistic coding matrix for the classification and examination of backchanneling phenomena. This matrix fuses the different, dynamic properties of spoken and non-verbal forms of this behaviour into a single, integrated conceptual model, providing the foundations, and a theoretical point of entry, for future research of this nature.
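
As a purely illustrative aside, the coding matrix described in the abstract can be pictured as a single record type that fuses the verbal and gestural (head nod) properties of one backchannel event. The Python sketch below is hypothetical: the class names, fields and example values are invented here for illustration and are not drawn from the thesis or the NMMC annotation scheme.

    # Hypothetical sketch: field names and values are illustrative only and do
    # not reproduce the thesis's coding matrix or the NMMC annotation scheme.
    from dataclasses import dataclass
    from enum import Enum
    from typing import Optional


    class Modality(Enum):
        VERBAL = "verbal"        # e.g. "mm", "yeah"
        GESTURAL = "gestural"    # e.g. a head nod
        COMBINED = "combined"    # co-occurring verbal and gestural signal


    @dataclass
    class BackchannelAnnotation:
        """One backchannel event, combining spoken and non-verbal properties."""
        start_time: float              # seconds from the start of the recording
        end_time: float
        modality: Modality
        form: str                      # surface form, e.g. "yeah" or "single nod"
        function: str                  # pragmatic function, e.g. "continuer"
        listener: str                  # interlocutor producing the backchannel
        within_sentence: bool          # within vs. across a sentence boundary
        co_text: Optional[str] = None  # surrounding speaker talk, if recorded


    # Example: a head nod overlapping a verbal "mm" within a sentence boundary.
    example = BackchannelAnnotation(
        start_time=12.4,
        end_time=13.1,
        modality=Modality.COMBINED,
        form="single nod + 'mm'",
        function="continuer",
        listener="Speaker B",
        within_sentence=True,
    )
    print(example)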
