Martinez-Hernandez, U.; Dodd, T.J.; Prescott, T.J.; Lepora, N.F. (2013)
Publisher: Springer Berlin Heidelberg
Languages: English
Types: Other
Over the past few decades the design of robots has gradually improved, allowing them to perform complex tasks in interaction with the world. To behave appropriately, robots need to make perceptual decisions about their environment using their various sensory modalities. Even though robots are being equipped with progressively more accurate and advanced sensors, dealing with uncertainty from the world and from their own sensory processes remains unavoidable in autonomous robotics. The challenge is to develop robust methods that allow robots to perceive their environment while managing uncertainty and optimizing their decision making. Such methods can be inspired by the way humans and animals actively direct their senses towards locations that reduce perceptual uncertainty [1]. For instance, humans not only use their hands and fingers for exploration and feature extraction; their movements are also guided by what is being perceived [2]. This behaviour is also present in the animal kingdom: rats, for example, actively explore their environment by appropriately moving their whiskers [3]. © 2013 Springer-Verlag Berlin Heidelberg.
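The perceptual decision making described above, accumulating noisy sensory evidence until belief in one hypothesis is strong enough to commit, can be sketched as recursive Bayesian updating with a belief threshold, in the spirit of refs. [5] and [7]. The observation models, threshold, and function names below are illustrative assumptions, not the authors' implementation:

```python
import random

def bayes_update(prior, likelihoods, obs):
    """One recursive Bayes step: posterior is proportional to P(obs | class) * prior."""
    unnorm = [likelihoods[c][obs] * prior[c] for c in range(len(prior))]
    total = sum(unnorm)
    return [p / total for p in unnorm]

def perceive(likelihoods, true_class, threshold=0.99, max_steps=100, seed=0):
    """Accumulate sensor readings until one class's belief crosses the
    decision threshold, then commit to the most probable class."""
    rng = random.Random(seed)
    n_classes = len(likelihoods)
    n_obs = len(likelihoods[0])
    belief = [1.0 / n_classes] * n_classes  # uniform prior over classes
    for step in range(1, max_steps + 1):
        # draw a noisy sensor reading from the true class's observation model
        obs = rng.choices(range(n_obs), weights=likelihoods[true_class])[0]
        belief = bayes_update(belief, likelihoods, obs)
        if max(belief) >= threshold:
            break  # evidence is strong enough to decide
    return belief.index(max(belief)), step

# Two perceptual classes whose observation models overlap (noisy sensing):
likelihoods = [[0.7, 0.2, 0.1],
               [0.1, 0.2, 0.7]]
decision, steps = perceive(likelihoods, true_class=0)
print(decision, steps)
```

Because the observation models overlap, a single reading is ambiguous; repeating the update concentrates the belief on the correct class after a handful of samples. Active perception goes further by also choosing *where* to sense next so that each reading is maximally informative.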

    • 1. R. Bajcsy. Active perception. Proceedings of the IEEE, 76(8):966-1005, 1988.
    • 2. S. J. Lederman and R. L. Klatzky. Hand movements: A window into haptic object recognition. Cognitive Psychology, 19:342-368, 1987.
    • 3. T. J. Prescott, M. E. Diamond, and A. M. Wing. Active touch sensing. Phil. Trans. R. Soc. B, 366:2989-2995, 2011.
    • 4. N. F. Lepora, M. Evans, C. W. Fox, M. E. Diamond, K. Gurney, and T. J. Prescott. Naive Bayes texture classification applied to whisker data from a moving robot. In Neural Networks (IJCNN), The 2010 International Joint Conference on, pages 1-8, 2010.
    • 5. N. F. Lepora, C. W. Fox, M. Evans, M. E. Diamond, K. Gurney, and T. J. Prescott. Optimal decision-making in mammals: Insights from a robot study of rodent texture discrimination. Journal of The Royal Society Interface, 9(72):1517-1528, 2012.
    • 6. U. Martinez-Hernandez, T. J. Dodd, L. Natale, G. Metta, T. J. Prescott, and N. F. Lepora. Active contour following to explore object shape with robot touch. In World Haptics (WHC), 2013 IEEE International Conference on, 2013.
    • 7. N. F. Lepora, U. Martinez-Hernandez, and T. J. Prescott. Active touch for robust perception under position uncertainty. In Robotics and Automation (ICRA), 2013 IEEE International Conference on, 2013.
    • 8. N. F. Lepora, U. Martinez-Hernandez, and T. J. Prescott. Active Bayesian perception for simultaneous object localization and identification. (under review).
    • 9. U. Martinez-Hernandez, T. J. Dodd, T. J. Prescott, and N. F. Lepora. Active Bayesian perception for angle and position discrimination with a biomimetic fingertip. (under review).


Funded by projects

  • EC | EFAA
