Wild, Peter; Hofbauer, Heinz; Ferryman, James; Uhl, Andreas (2015)
Languages: English
This paper investigates the potential of fusion at the normalisation/segmentation level, prior to feature extraction. While there are several biometric fusion methods at the data/feature, score, and rank/decision levels, combining raw biometric signals, scores, or ranks/decisions, fusion at the segmentation level is still in its infancy. However, the increasing demand for more relaxed and less invasive recording conditions, especially for on-the-move iris recognition, suggests further investigating fusion at this very low level. This paper focuses on multi-segmentation fusion for iris biometric systems, investigating the benefit of combining the segmentation results of multiple normalisation algorithms, using four methods from two public iris toolkits (USIT, OSIRIS) on the public CASIA and IITD iris datasets. Evaluations based on recognition accuracy and ground-truth segmentation data indicate high sensitivity with regard to the type of errors made by segmentation algorithms.
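The abstract does not specify the fusion rule applied to the combined segmentation results; a common and simple choice for combining binary segmentation masks is a pixel-wise majority vote. The sketch below is purely illustrative of that idea (the function name `fuse_masks` and the voting rule are assumptions, not the paper's method):

```python
# Hypothetical sketch: pixel-wise majority-vote fusion of binary
# segmentation masks produced by several iris segmentation algorithms.
# Majority voting is only an illustrative fusion rule; the paper's
# actual combination scheme is not given in the abstract.

def fuse_masks(masks):
    """Fuse binary masks (equal-sized grids of 0/1) by pixel-wise majority vote."""
    if not masks:
        raise ValueError("need at least one mask")
    rows, cols = len(masks[0]), len(masks[0][0])
    threshold = len(masks) / 2  # strict majority of the input masks
    fused = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            votes = sum(m[r][c] for m in masks)  # how many algorithms say "iris"
            fused[r][c] = 1 if votes > threshold else 0
    return fused

# Three toy 2x3 masks: a pixel is kept as iris (1) if at least
# two of the three algorithms agree on it.
m1 = [[1, 1, 0], [0, 1, 0]]
m2 = [[1, 0, 0], [0, 1, 1]]
m3 = [[1, 1, 1], [0, 0, 1]]
print(fuse_masks([m1, m2, m3]))  # -> [[1, 1, 0], [0, 1, 1]]
```

A design note on this kind of fusion: a majority vote suppresses errors made by a single algorithm but, as the evaluation above suggests, it is sensitive to correlated errors, i.e. cases where several segmenters fail in the same way on the same image.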


Funded by:

  • FWF | Biometric Sensor Forensics
