Bingham, Mark; Taylor, D.; Gledhill, Duke; Xu, Zhijie (2008)
Publisher: University of Huddersfield
Languages: English
Types: Part of book or chapter of book
Subjects: T1

Classified by OpenAIRE into

ACM Ref: Computing Methodologies: Image Processing and Computer Vision
A number of issues need to be taken into consideration for the output of an augmented reality application to appear realistic. Such issues include the level of detail of the virtual scene, the accuracy of alignment between the real and virtual worlds, and the correspondence of illumination between the real and virtual components. This paper reviews registration techniques used to perform the alignment process and discusses a research project that investigates methods for matching light conditions. Although a number of registration techniques exist, feature-based registration has not yet achieved popularity within production environments because of the associated difficulties. Instead, fiducial-marker-based systems are being implemented for their robustness and ease of implementation. Further work to be undertaken is discussed towards the end of the paper.
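
The registration step the abstract refers to amounts to recovering the camera pose relative to a known fiducial marker, so that virtual geometry can be rendered in alignment with the real scene. The sketch below illustrates that step using OpenCV's ArUco module (the 4.7+ API); the calibration matrix, marker size, and input file name are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of fiducial-marker registration with OpenCV's ArUco module.
# Assumes OpenCV >= 4.7, a calibrated camera, and a marker of known size;
# all concrete values below are placeholders, not from the reviewed paper.
import cv2
import numpy as np

MARKER_LENGTH = 0.05  # marker side length in metres (assumed)

# Hypothetical intrinsics; in practice these come from cv2.calibrateCamera.
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("frame.png")  # placeholder: one video frame containing a marker
corners, ids, _rejected = detector.detectMarkers(frame)

if ids is not None:
    # Marker corners in the marker's own coordinate frame (z = 0 plane).
    half = MARKER_LENGTH / 2
    obj_points = np.array([[-half,  half, 0],
                           [ half,  half, 0],
                           [ half, -half, 0],
                           [-half, -half, 0]], dtype=np.float32)
    img_points = corners[0].reshape(-1, 2).astype(np.float32)

    # solvePnP recovers the camera pose relative to the marker; this pose is
    # what lets virtual content be drawn in registration with the real scene.
    ok, rvec, tvec = cv2.solvePnP(obj_points, img_points,
                                  camera_matrix, dist_coeffs)
    if ok:
        print("rotation (Rodrigues):", rvec.ravel())
        print("translation (m):", tvec.ravel())
```

With the pose recovered, the same rotation and translation would typically be passed to the renderer as a model-view transform, so the virtual objects remain registered to the marker as the camera moves; this is the robustness and ease of implementation the abstract attributes to marker-based systems.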
