Show simple item record

dc.contributor.author	Dwork, Cynthia
dc.contributor.author	Rothblum, Guy N.
dc.contributor.author	Vadhan, Salil P.
dc.date.accessioned	2011-05-24T15:06:59Z
dc.date.issued	2010
dc.identifier.citation	Dwork, Cynthia, Guy N. Rothblum, and Salil Vadhan. 2010. Boosting and differential privacy. In Proceedings: 51st Annual IEEE Symposium on Foundations of Computer Science (FOCS), Las Vegas, Nevada, 23-26 October 2010. Los Alamitos, CA: IEEE Computer Society.	en_US
dc.identifier.isbn	978-1-4244-8525-3	en_US
dc.identifier.issn	0272-5428	en_US
dc.identifier.uri	http://nrs.harvard.edu/urn-3:HUL.InstRepos:4894816
dc.description.abstract	Boosting is a general method for improving the accuracy of learning algorithms. We use boosting to construct improved privacy-preserving synopses of an input database. These are data structures that yield, for a given set \(Q\) of queries over an input database, reasonably accurate estimates of the responses to every query in \(Q\), even when the number of queries is much larger than the number of rows in the database. Given a base synopsis generator that takes a distribution on \(Q\) and produces a “weak” synopsis that yields “good” answers for a majority of the weight in \(Q\), our Boosting for Queries algorithm obtains a synopsis that is good for all of \(Q\). We ensure privacy for the rows of the database, but the boosting is performed on the queries. We also provide the first synopsis generators for arbitrary sets of arbitrary low-sensitivity queries, i.e., queries whose answers do not vary much under the addition or deletion of a single row. In the execution of our algorithm certain tasks, each incurring some privacy loss, are performed many times. To analyze the cumulative privacy loss, we obtain an \(O(\varepsilon^2)\) bound on the expected privacy loss from a single \(\varepsilon\)-differentially private mechanism. Combining this with evolution of confidence arguments from the literature, we get stronger bounds on the expected cumulative privacy loss due to multiple mechanisms, each of which provides \(\varepsilon\)-differential privacy or one of its relaxations, and each of which operates on (potentially) different, adaptively chosen, databases.	en_US
dc.description.sponsorship	Engineering and Applied Sciences	en_US
dc.language.iso	en_US	en_US
dc.publisher	IEEE Computer Society	en_US
dc.relation.isversionof	doi:10.1109/FOCS.2010.12	en_US
dc.relation.hasversion	http://people.seas.harvard.edu/~salil/research/PrivateBoosting-focs.pdf	en_US
dash.license	META_ONLY
dc.title	Boosting and Differential Privacy	en_US
dc.type	Conference Paper	en_US
dc.description.version	Version of Record	en_US
dash.depositing.author	Vadhan, Salil P.
dash.embargo.until	10000-01-01
dc.identifier.doi	10.1109/FOCS.2010.12
dash.contributor.affiliated	Vadhan, Salil
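The abstract above analyzes the cumulative loss from \(\varepsilon\)-differentially private mechanisms. As a minimal illustration of what a single such mechanism looks like (this is a standard Laplace-mechanism sketch for a sensitivity-1 counting query, not the paper's boosting algorithm; the function and variable names are hypothetical):

```python
import math
import random

def private_count(database, predicate, epsilon):
    """Answer a counting query with epsilon-differential privacy.

    A counting query has sensitivity 1: adding or removing one row
    changes the true answer by at most 1, so Laplace noise with
    scale 1/epsilon suffices. Illustrative sketch only.
    """
    true_answer = sum(1 for row in database if predicate(row))
    # Inverse-CDF sample from Laplace(0, 1/epsilon).
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_answer + noise

# Example: noisily count rows with value over 10 under epsilon = 0.5.
db = [3, 12, 7, 25, 18]
noisy = private_count(db, lambda x: x > 10, epsilon=0.5)
```

Each call to such a mechanism incurs privacy loss; the paper's contribution includes showing the *expected* loss of one \(\varepsilon\)-private invocation is \(O(\varepsilon^2)\), which tightens composition bounds over many invocations.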

