Publisher: IEEE/ACM
Languages: English
Types: Unknown
Subjects: Classified by OpenAIRE into ACM Ref: Information Systems / Information Interfaces and Presentation (e.g., HCI)
Emphasis, by means of either pitch accents or beat gestures (rhythmic co-verbal gestures with no semantic meaning), has been shown to serve two main purposes in human communication: syntactic disambiguation and salience. To use beat gestures in this role, interlocutors must be able to integrate them with the speech they accompany. Whether such integration is possible when the multi-modal communicative information is produced by a humanoid robot, and whether it is as efficient as for human communicators, are questions that need to be answered to further our understanding of the efficacy of humanoid robots for naturalistic, human-like communication.

Here, we present an experiment which, using a fully within-subjects design, shows that there is a marked difference in speech and gesture integration between human and robot communicators, with integration being significantly less effective for the robot. In contrast to beat gestures, the effects of speech emphasis are the same whether that speech is played through a robot or as part of a video of a human. Thus, while integration of speech emphasis and verbal information does occur for robot communicators, integration of non-informative beat gestures and verbal information does not, despite comparable timing and motion profiles to human gestures.
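The fully within-subjects design mentioned above means each participant was exposed to both the human and the robot communicator, so responses in the two conditions can be compared in pairs. As a minimal illustrative sketch only (the accuracy numbers and variable names below are invented, and this is not the authors' analysis code), such a paired comparison in Python might look like this:

    # Hypothetical within-subjects comparison of response accuracy for the
    # human vs. robot communicator conditions. All data below are invented
    # purely to illustrate the shape of the analysis.
    import numpy as np
    from scipy import stats

    # One accuracy score per participant per condition; entry i in both
    # arrays belongs to the same participant (the within-subjects pairing).
    human_acc = np.array([0.91, 0.88, 0.93, 0.85, 0.90, 0.87, 0.92, 0.89])
    robot_acc = np.array([0.78, 0.74, 0.81, 0.70, 0.77, 0.72, 0.80, 0.75])

    # A paired test respects that pairing instead of treating the two
    # conditions as independent samples.
    t_stat, p_value = stats.ttest_rel(human_acc, robot_acc)
    print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")

Pairing each participant's two scores removes between-participant variability from the comparison, which is what makes fully within-subjects designs statistically efficient for contrasts like this one.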