Access Restriction Subscribed

Author Evfimievski, Alexandre ♦ Fagin, Ronald ♦ Woodruff, David
Source ACM Digital Library
Content type Text
Publisher Association for Computing Machinery (ACM)
File Format PDF
Copyright Year ©2010
Language English
Subject Domain (in DDC) Computer science, information & general works ♦ Data processing & computer science
Subject Keyword Auditing ♦ Positivstellensatz ♦ Disclosure ♦ Privacy ♦ Query logs ♦ Reasoning about knowledge ♦ Supermodularity
Abstract We present a novel definition of privacy in the framework of offline (retroactive) database query auditing. Given information about the database, a description of sensitive data, and assumptions about users' prior knowledge, our goal is to determine if answering a past user's query could have led to a privacy breach. According to our definition, an audited property A is private, given the disclosure of property B, if no user can gain confidence in A by learning B, subject to prior knowledge constraints. Privacy is not violated if the disclosure of B causes a loss of confidence in A. The new notion of privacy is formalized using the well-known semantics for reasoning about knowledge, where logical properties correspond to sets of possible worlds (databases) that satisfy these properties. Database users are modeled either as possibilistic agents whose knowledge is a set of possible worlds, or as probabilistic agents whose knowledge is a probability distribution on possible worlds. We analyze the new privacy notion, show its relationship with the conventional approach, and derive criteria that allow the auditor to test privacy efficiently in some important cases. In particular, we prove characterization theorems for the possibilistic case, and study in depth the probabilistic case under the assumption that all database records are considered a priori independent by the user, as well as under more relaxed (or absent) prior-knowledge assumptions. In the probabilistic case we show that for certain families of distributions there is no efficient algorithm to test whether an audited property A is private given the disclosure of a property B, assuming P ≠ NP. Nevertheless, for many interesting families, such as the family of product distributions, we obtain algorithms that are efficient both in theory and in practice.
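The abstract's confidence-gain criterion can be illustrated with a brute-force sketch (not the paper's efficient algorithms): model a database of n records under a product distribution, where record i is present independently with some probability, and flag a breach for property A given disclosure B exactly when P(A | B) > P(A). The function names and the example properties below are hypothetical, chosen only to make the definition concrete; the enumeration is exponential in n.

```python
from itertools import product

def prob(record_probs, worlds):
    # Probability of a set of possible worlds under a product
    # distribution: a world is a tuple of bits, where bit i says
    # whether record i is present (probability record_probs[i]).
    total = 0.0
    for w in worlds:
        p = 1.0
        for bit, pi in zip(w, record_probs):
            p *= pi if bit else (1.0 - pi)
        total += p
    return total

def gains_confidence(n, p_record, A, B):
    # Brute-force check (exponential in n): does learning that
    # property B holds increase the user's confidence in A?
    # Per the paper's definition, only a gain counts as a breach;
    # a loss of confidence in A does not violate privacy.
    worlds = list(product([0, 1], repeat=n))
    probs = [p_record] * n
    pA = prob(probs, [w for w in worlds if A(w)])
    pB = prob(probs, [w for w in worlds if B(w)])
    pAB = prob(probs, [w for w in worlds if A(w) and B(w)])
    return pAB / pB > pA

# Hypothetical example with 3 records, each present with prob. 0.5:
# A = "record 0 is in the database", B = "at least 2 records present".
A = lambda w: w[0] == 1
B = lambda w: sum(w) >= 2
print(gains_confidence(3, 0.5, A, B))  # P(A|B) = 0.75 > P(A) = 0.5 -> True
```

By contrast, disclosing B for the complementary property "record 0 is absent" lowers the user's confidence, which under this definition is not a breach.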
ISSN 0004-5411
Age Range 18 to 22 years ♦ above 22 years
Educational Use Research
Education Level UG and PG
Learning Resource Type Article
Publisher Date 2010-12-01
Publisher Place New York
e-ISSN 1557-735X
Journal Journal of the ACM (JACM)
Volume Number 58
Issue Number 1
Page Count 45
Starting Page 1
Ending Page 45