Coppola, Claudio; Faria, Diego R.; Nunes, Urbano; Bellotto, Nicola (2016)
Publisher: IEEE
Languages: English
Types: Part of book or chapter of book
Social activity based on body motion is a key feature of non-verbal and physical behavior, serving as a communicative signal in social interaction between individuals. Social activity recognition is important for studying human-human communication and also human-robot interaction. Based on that, this research has three goals: (1) to recognize social behavior (e.g. human-human interaction) using a probabilistic approach that merges spatio-temporal features from individual bodies with social features from the relationship between two individuals; (2) to learn priors based on the physical proximity between individuals during an interaction, using proxemics theory, and feed them to a probabilistic ensemble of activity classifiers; and (3) to provide a public dataset with RGB-D data of social daily activities, including risk situations, useful for testing assisted-living approaches, since this type of dataset is still missing. Results show that the proposed approach, designed to merge features with different semantics and proximity priors, improves classification performance in terms of precision, recall and accuracy when compared with approaches that employ alternative strategies.
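To make the fusion step described above concrete, below is a minimal sketch (not the authors' implementation) of how per-classifier posteriors over activity classes could be merged with a proximity prior derived from Hall's proxemic distance bands. The band boundaries, the priors_by_band table, and the function names proximity_prior and merge_classifiers are illustrative assumptions introduced here, not taken from the paper.

```python
import numpy as np

# Hall's proxemic bands (metres): intimate, personal, social, public.
# A class prior is associated with each band; the values below are a toy,
# hand-set table used purely for illustration.
PROXEMIC_BANDS = [(0.0, 0.45, "intimate"),
                  (0.45, 1.2, "personal"),
                  (1.2, 3.6, "social"),
                  (3.6, np.inf, "public")]

def proximity_prior(distance_m, priors_by_band):
    """Return the class prior for the proxemic band containing `distance_m`."""
    for low, high, name in PROXEMIC_BANDS:
        if low <= distance_m < high:
            return priors_by_band[name]
    raise ValueError("distance out of range")

def merge_classifiers(posteriors, distance_m, priors_by_band):
    """Fuse per-classifier posteriors (e.g. one from individual-body features,
    one from social/relational features) with a proximity prior.
    Each posterior is an array summing to 1 over the activity classes."""
    fused = proximity_prior(distance_m, priors_by_band).copy()
    for p in posteriors:
        fused *= p                      # naive-Bayes-style product of experts
    return fused / fused.sum()          # renormalise to a distribution

# Example: three activity classes (handshake, hug, push), two classifiers,
# and two people standing 0.8 m apart (personal band).
priors = {"intimate": np.array([0.2, 0.5, 0.3]),
          "personal": np.array([0.5, 0.2, 0.3]),
          "social":   np.array([0.4, 0.1, 0.5]),
          "public":   np.array([1/3, 1/3, 1/3])}
p_body   = np.array([0.6, 0.3, 0.1])   # posterior from individual-body features
p_social = np.array([0.5, 0.4, 0.1])   # posterior from relational (social) features
print(merge_classifiers([p_body, p_social], 0.8, priors))
```

The abstract does not specify the exact fusion rule, so the plain product of experts above is only one plausible reading; a weighted log-linear combination, or any other ensemble rule fed by the same proximity prior, would fit the description equally well.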


Funded by projects

  • EC | ENRICHME
