000 08890nam a2200649 i 4500
001 7731575
003 IEEE
005 20200413152922.0
006 m eo d
007 cr cn |||m|||a
008 161108s2017 cau foab 000 0 eng d
020 _a9781627052979
_qebook
020 _z9781627054935
_qprint
024 7 _a10.2200/S00735ED1V01Y201609SPT018
_2doi
035 _a(CaBNVSL)swl00406915
035 _a(OCoLC)962271420
040 _aCaBNVSL
_beng
_erda
_cCaBNVSL
_dCaBNVSL
050 4 _aBF637.P74
_bL53 2017
082 0 4 _a155.92
_223
100 1 _aLi, Ninghui,
_eauthor.
245 1 0 _aDifferential privacy :
_bfrom theory to practice /
_cNinghui Li, Min Lyu, Dong Su, Weining Yang.
264 1 _a[San Rafael, California] :
_bMorgan & Claypool,
_c2017.
300 _a1 PDF (xiii, 124 pages)
336 _atext
_2rdacontent
337 _aelectronic
_2isbdmedia
338 _aonline resource
_2rdacarrier
490 1 _aSynthesis lectures on information security, privacy, and trust,
_x1945-9750 ;
_v# 18
538 _aMode of access: World Wide Web.
538 _aSystem requirements: Adobe Acrobat Reader.
500 _aPart of: Synthesis digital library of engineering and computer science.
504 _aIncludes bibliographical references (pages 113-121).
505 0 _a1. Introduction -- 1.1 Privacy violation incidents -- 1.1.1 Privacy incidents -- 1.1.2 Lessons from privacy incidents -- 1.2 On balancing theory and practice -- 1.3 Organization of this book -- 1.4 Topics for volume 2 --
505 8 _a2. A primer on [epsilon]-differential privacy -- 2.1 The definition of [epsilon]-DP -- 2.1.1 Bounded DP or unbounded DP -- 2.2 Properties of [epsilon]-DP -- 2.2.1 Post-processing and sequential composition -- 2.2.2 Parallel composition and convexity -- 2.3 The Laplace mechanism -- 2.3.1 The scalar case -- 2.3.2 The vector case -- 2.4 The exponential mechanism -- 2.4.1 The general case of the exponential mechanism -- 2.4.2 The monotonic case of the exponential mechanism -- 2.4.3 Case study: computing mode and median -- 2.4.4 Discussion on the exponential mechanism -- 2.5 Case study: computing average -- 2.5.1 Applying the Laplace and the exponential mechanism -- 2.5.2 Applying the Laplace mechanism and composition -- 2.5.3 A non-private average algorithm using accurate count -- 2.5.4 NoisyAverage with accurate count -- 2.5.5 NoisyAverage with normalization -- 2.5.6 Which is best -- 2.6 Settings to apply DP -- 2.7 Bibliographical notes --
505 8 _a3. What does DP mean? -- 3.1 Limitations of syntactic notions -- 3.2 Semantic guarantees of differential privacy -- 3.2.1 Infeasibility of achieving "privacy as secrecy" -- 3.2.2 Toward a "real-world-ideal-world" approach -- 3.2.3 DP as approximating the ideal world of "privacy as control" -- 3.2.4 A formulation of DP's semantic guarantee -- 3.2.5 The personal data principle -- 3.2.6 A case study in applying PDP -- 3.3 Examining DP and PDP -- 3.3.1 When the notion of neighboring datasets is defined incorrectly -- 3.3.2 When using DP in the local setting -- 3.3.3 What constitutes one individual's data -- 3.3.4 An individual's personal data or personal data under one individual's control -- 3.3.5 Group privacy as a potential legal Achilles' heel for DP -- 3.3.6 A moral challenge to private party benefiting from DP -- 3.4 Additional caveats when using DP -- 3.4.1 Using an [epsilon] that is too large -- 3.4.2 Applying a model to personal data -- 3.4.3 Privacy and discrimination -- 3.5 Bibliographical notes --
505 8 _a4. Publishing histograms for low-dimensional datasets -- 4.1 Problem definition -- 4.1.1 Three settings -- 4.1.2 Measuring utility -- 4.2 Dense pre-defined partitioning -- 4.2.1 The baseline: a simple histogram -- 4.2.2 The hierarchical method -- 4.2.3 Constrained inference -- 4.2.4 Effect of privacy budget allocation in hierarchical histograms -- 4.2.5 Wavelet transforms and other optimizations -- 4.2.6 Beyond one-dimensional datasets -- 4.3 Lacking suitable partitioning -- 4.3.1 The uniform grid method--UG -- 4.3.2 The adaptive grids approach--AG, 2D case -- 4.3.3 Bottom-up grouping -- 4.3.4 Recursive partitioning -- 4.4 Bibliographical notes --
505 8 _a5. Differentially private optimization -- 5.1 Example optimization problems -- 5.1.1 k-means clustering -- 5.1.2 Linear regression -- 5.1.3 Logistic regression -- 5.1.4 SVM -- 5.2 Objective perturbation -- 5.2.1 Adding a noisy linear term to the optimization objective function -- 5.2.2 The functional mechanism -- 5.3 Make an existing algorithm private -- 5.3.1 DPLloyd: differentially private Lloyd algorithm for k-means clustering -- 5.3.2 DiffPID3: differentially private ID3 algorithm for decision tree classification -- 5.4 Iterative local search via EM -- 5.4.1 PrivGene: differentially private model fitting using genetic algorithms -- 5.4.2 Iterative local search -- 5.4.3 Enhanced exponential mechanism -- 5.5 Histograms optimized for optimization -- 5.5.1 Uniform grid and its extensions -- 5.5.2 Histogram publishing for estimating M-estimators -- 5.5.3 DiffGen: differentially private anonymization based on generalization -- 5.5.4 PrivPfC: differentially private data publication for classification -- 5.6 Bibliographical notes --
505 8 _a6. Publishing marginals -- 6.1 Problem definition -- 6.2 Methods that don't fit the problem -- 6.2.1 The flat method -- 6.2.2 The direct method -- 6.2.3 Adding noise in the Fourier domain -- 6.2.4 Data cubes -- 6.2.5 Multiplicative weights mechanism -- 6.2.6 Learning based approaches -- 6.3 The PriView approach -- 6.3.1 Summary of the PriView approach -- 6.3.2 Computing k-way marginals -- 6.3.3 Consistency between noisy views -- 6.3.4 Choosing a set of views -- 6.3.5 Space and time complexity -- 6.4 Bibliographical notes --
505 8 _a7. The sparse vector technique -- 7.1 Introduction -- 7.2 Variants of SVT -- 7.2.1 Privacy proof for proposed SVT -- 7.2.2 Privacy properties of other variants -- 7.2.3 Error in privacy analysis of GPTT -- 7.2.4 Other variants -- 7.3 Optimizing SVT -- 7.3.1 A generalized SVT algorithm -- 7.3.2 Optimizing privacy budget allocation -- 7.3.3 SVT for monotonic queries -- 7.4 SVT vs. EM -- 7.4.1 Evaluation -- 7.5 Bibliographical notes -- Bibliography -- Authors' biographies.
506 1 _aAbstract freely available; full-text restricted to subscribers or individual document purchasers.
510 0 _aCompendex
510 0 _aINSPEC
510 0 _aGoogle Scholar
510 0 _aGoogle Book Search
520 3 _aOver the last decade, differential privacy (DP) has emerged as the de facto standard privacy notion for research in privacy-preserving data analysis and publishing. The DP notion offers a strong privacy guarantee and has been applied to many data analysis tasks. This Synthesis Lecture is the first of two volumes on differential privacy. It differs from existing books and surveys on differential privacy in that we take an approach that balances theory and practice. We focus on the empirical accuracy of algorithms rather than on asymptotic accuracy guarantees, and at the same time we try to explain why the algorithms perform the way they do empirically. We also take a balanced approach to the semantic meaning of differential privacy, explaining both its strong guarantees and its limitations. We start by examining the definition and basic properties of DP and the main primitives for achieving it. We then give a detailed discussion of the semantic privacy guarantee provided by DP and the caveats to keep in mind when applying it. Next, we review state-of-the-art mechanisms for publishing histograms of low-dimensional datasets, mechanisms for machine learning tasks such as classification, regression, and clustering, and mechanisms for publishing information to answer marginal queries over high-dimensional datasets. Finally, we explain the sparse vector technique, including the many errors that have been made in the literature when using it. The planned Volume 2 will cover the use of DP in other settings, including high-dimensional datasets, graph datasets, the local setting, location privacy, and so on. We will also discuss various relaxations of DP.
530 _aAlso available in print.
588 _aTitle from PDF title page (viewed on November 8, 2016).
650 0 _aPrivacy
_xMathematical models.
653 _aprivacy
653 _aanonymization
700 1 _aLyu, Min,
_eauthor.
700 1 _aSu, Dong,
_eauthor.
700 1 _aYang, Weining,
_eauthor.
776 0 8 _iPrint version:
_z9781627054935
830 0 _aSynthesis digital library of engineering and computer science.
830 0 _aSynthesis lectures on information security, privacy, and trust ;
_v# 18.
_x1945-9750
856 4 2 _3Abstract with links to resource
_uhttp://ieeexplore.ieee.org/servlet/opac?bknumber=7731575
999 _c562228
_d562228