Graph structures have proven computationally cumbersome for pattern analysis. The reason for this is that, before graphs can be converted to pattern vectors, correspondences must be established between the nodes of structures which are potentially of different size. To overcome this problem, in this paper, we turn to the spectral decomposition of the Laplacian matrix. We show how the elements of the spectral matrix for the Laplacian can be used to construct symmetric polynomials that are permutation invariant. The coefficients of these polynomials can be used as graph features which can be encoded in a vectorial manner. We extend this representation to graphs in which there are unary attributes on the nodes and binary attributes on the edges by using the spectral decomposition of a Hermitian property matrix that can be viewed as a complex analogue of the Laplacian. To embed the graphs in a pattern space, we explore whether the vectors of invariants can be embedded in a low-dimensional space using a number of alternative strategies, including principal components analysis (PCA), multidimensional scaling (MDS), and locality preserving projection (LPP). Experimentally, we demonstrate that the embeddings result in well-defined graph clusters. Our experiments with the spectral representation involve both synthetic and real-world data. The experiments with synthetic data demonstrate that the distances between spectral feature vectors can be used to discriminate between graphs on the basis of their structure. The real-world experiments show that the method can be used to locate clusters of graphs.
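The key idea above, that symmetric polynomials of spectral quantities are invariant to node reordering, can be illustrated with a simplified sketch. The snippet below (an assumption for illustration, not the paper's exact construction, which builds the polynomials from elements of the full spectral matrix) computes the elementary symmetric polynomials of the Laplacian eigenvalues via Newton's identities and checks that two differently ordered adjacency matrices of the same graph yield identical feature vectors:

```python
import numpy as np

def laplacian(A):
    """Graph Laplacian L = D - A from an adjacency matrix A."""
    return np.diag(A.sum(axis=1)) - A

def spectral_feature_vector(A, k=3):
    """Permutation-invariant features: the first k elementary symmetric
    polynomials e_1..e_k of the Laplacian eigenvalues, computed from the
    power sums via Newton's identities. Relabeling the nodes of A leaves
    the eigenvalues, and hence the features, unchanged."""
    evals = np.linalg.eigvalsh(laplacian(A))
    # Power sums p_r = sum_i lambda_i^r, for r = 1..k.
    p = [(evals ** r).sum() for r in range(1, k + 1)]
    # Newton's identities: e_r = (1/r) * sum_{i=1}^{r} (-1)^(i-1) e_{r-i} p_i.
    e = [1.0]
    for r in range(1, k + 1):
        e.append(sum((-1) ** (i - 1) * e[r - i] * p[i - 1]
                     for i in range(1, r + 1)) / r)
    return np.array(e[1:])

# Two node orderings of the same triangle graph give identical features.
A1 = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)
P = np.eye(3)[[2, 0, 1]]           # a permutation matrix
A2 = P @ A1 @ P.T
assert np.allclose(spectral_feature_vector(A1), spectral_feature_vector(A2))
```

The resulting vectors can then be fed to PCA, MDS, or LPP for embedding, since every graph, whatever its node labeling, maps to the same point in feature space.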