Publisher: ACM
Language: English
Subjects: QA76
ACM Classification: Mathematics of Computing / Discrete Mathematics
Many data analysis tasks rely on the abstraction of a graph to represent relations between entities, with attributes on the nodes and edges. Since the relationships encoded are often sensitive, we seek effective ways to release representative graphs which nevertheless protect the privacy of the data subjects. Prior work on this topic has focused primarily on the graph structure in isolation, and has not provided ways to handle richer graphs with correlated attributes.

We introduce an approach to release such graphs under the strong guarantee of differential privacy. We adapt existing graph models, introduce a new one, and show how to augment them with meaningful privacy. This provides a complete workflow, where the input is a sensitive graph, and the output is a realistic synthetic graph. Our experimental study demonstrates that our process produces useful, accurate attributed graphs.
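The workflow the abstract describes (sensitive graph in, synthetic graph out) can be sketched in two stages: first measure a statistic of the sensitive graph under differential privacy, then treat model fitting and graph sampling as post-processing on the noisy statistic. The sketch below is illustrative only and does not reproduce the paper's models: it privatizes a degree histogram with the Laplace mechanism and samples edges Chung-Lu style, and all function names are hypothetical.

```python
import random

def private_degree_histogram(degrees, epsilon, max_degree):
    """Measure a degree histogram of the sensitive graph under DP.

    Adding or removing one edge changes two histogram cells, so the
    sensitivity is 2 and Laplace noise with scale 2/epsilon suffices.
    """
    hist = [0.0] * (max_degree + 1)
    for d in degrees:
        hist[min(d, max_degree)] += 1.0
    scale = 2.0 / epsilon
    noisy = []
    for c in hist:
        # Laplace sample as the difference of two exponentials.
        noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
        noisy.append(max(0.0, c + noise))  # post-processing: clamp negatives
    return noisy

def sample_synthetic_graph(noisy_hist, n):
    """Post-processing: draw expected degrees from the noisy histogram and
    wire edges Chung-Lu style (edge (i, j) kept w.p. w_i * w_j / sum(w))."""
    weights = random.choices(range(len(noisy_hist)),
                             weights=[c + 1e-9 for c in noisy_hist], k=n)
    s = float(sum(weights)) or 1.0
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            if random.random() < min(1.0, weights[i] * weights[j] / s):
                edges.add((i, j))
    return edges
```

Because only `private_degree_histogram` touches the sensitive data, the whole pipeline inherits its differential privacy guarantee by the post-processing property.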
The sensitivity of f_QX(X) (line 1) is 2. In line 3, we add independent Laplace noise with mean zero and scale 2/ε, which gives ε-differential privacy (the Laplace mechanism). The remainder of the algorithm, including clamping the noisy counts to the range [0, n], operates only on the already-noisy counts, and so does not impact privacy.
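The noise-and-clamp step can be sketched as follows; this is a minimal illustration, not the algorithm from the paper, and the function name and parameters are hypothetical. It assumes a list of counts with per-entry sensitivity 2 and a privacy parameter `epsilon`.

```python
import random

def noisy_clamped_counts(counts, epsilon, n, sensitivity=2.0):
    """Add Laplace(0, sensitivity/epsilon) noise to each count, then
    clamp to [0, n]; clamping is post-processing and costs no privacy."""
    scale = sensitivity / epsilon
    out = []
    for c in counts:
        # Laplace sample as the difference of two exponentials.
        noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
        out.append(min(float(n), max(0.0, c + noise)))
    return out
```

Note that the clamping happens strictly after the noise is added: any function of an ε-differentially private output is itself ε-differentially private.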
Funded by projects

  • EC | DAPPER
