Article type: Research Article
Authors: Alvim, Mário S. [a],* | Andrés, Miguel E. [b],** | Chatzikokolakis, Konstantinos [c] | Degano, Pierpaolo [d] | Palamidessi, Catuscia [b],[e]
Affiliations: [a] Computer Science Department, Universidade Federal de Minas Gerais, Belo Horizonte, Brazil | [b] LIX, École Polytechnique, Palaiseau, France | [c] CNRS, École Polytechnique, Palaiseau, France | [d] Dipartimento di Informatica, Università di Pisa, Pisa, Italy | [e] INRIA, École Polytechnique, Palaiseau, France
Correspondence: [*] Corresponding author. E-mail: msalvim@dcc.ufmg.br.
Note: [**] Current affiliation: Google, Inc.
Abstract: Differential privacy aims at protecting the privacy of participants in statistical databases. Roughly, a mechanism satisfies differential privacy if the presence or value of a single individual in the database does not significantly change the likelihood of obtaining a certain answer to any statistical query posed by a data analyst. Differentially-private mechanisms are often oblivious: first the query is processed on the database to produce a true answer, and then this answer is adequately randomized before being reported to the data analyst. Ideally, a mechanism should minimize leakage – i.e., obfuscate as much as possible the link between reported answers and individuals’ data – while maximizing utility – i.e., report answers as similar as possible to the true ones. These two goals, however, are in conflict with each other, thus imposing a trade-off between privacy and utility. In this paper we use quantitative information flow principles to analyze leakage and utility in oblivious differentially-private mechanisms. We introduce a technique that exploits graph symmetries of the adjacency relation on databases to derive bounds on the min-entropy leakage of the mechanism. We consider a notion of utility based on identity gain functions, which is closely related to min-entropy leakage, and we derive bounds for it. Finally, given some graph symmetries, we provide a mechanism that maximizes utility while preserving the required level of differential privacy.
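For illustration only, the following minimal Python sketch shows the oblivious structure described in the abstract on a counting query, using the standard Laplace mechanism as the randomization step. The function names, the choice of noise distribution, and the parameters are assumptions made for this sketch; they are not taken from the paper, which studies geometric-style mechanisms and graph symmetries in detail.

import math
import random

def true_count(database, predicate):
    # Oblivious step 1: evaluate the statistical query on the database
    # to obtain the true answer.
    return sum(1 for row in database if predicate(row))

def laplace_noise(scale):
    # Sample from a Laplace(0, scale) distribution via inverse transform sampling.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def oblivious_mechanism(database, predicate, epsilon):
    # Oblivious step 2: randomize the true answer before reporting it.
    # A counting query changes by at most 1 when a single individual's record
    # changes (sensitivity 1), so Laplace noise with scale 1/epsilon yields
    # epsilon-differential privacy for this query.
    return true_count(database, predicate) + laplace_noise(1.0 / epsilon)

# Example usage (hypothetical data): report a noisy count of adults.
# reported = oblivious_mechanism(people, lambda p: p["age"] >= 18, epsilon=0.5)

Smaller epsilon means more noise (less leakage, lower utility); larger epsilon means answers closer to the true count (higher utility, more leakage), which is the privacy-utility trade-off the paper analyzes.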
Keywords: Differential privacy, information flow, min-entropy leakage, gain functions, optimal mechanisms
DOI: 10.3233/JCS-150528
Journal: Journal of Computer Security, vol. 23, no. 4, pp. 427-469, 2015