Testing differential privacy with dual interpreters
Proceedings of the ACM on Programming Languages, 2020•dl.acm.org
Applying differential privacy at scale requires convenient ways to check that programs computing with sensitive data appropriately preserve privacy. We propose here a fully automated framework for testing differential privacy, adapting a well-known “pointwise” technique from informal proofs of differential privacy. Our framework, called DPCheck, requires no programmer annotations, handles all previously verified or tested algorithms, and is the first fully automated framework to distinguish correct and buggy implementations of PrivTree, a probabilistically terminating algorithm that has not previously been mechanically checked.
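The "pointwise" idea the abstract refers to can be illustrated with a minimal, hypothetical sketch (not DPCheck's actual algorithm): for neighboring databases D and D', differential privacy requires Pr[M(D)=o] ≤ e^ε · Pr[M(D')=o] for every output o, so one can estimate both probabilities by sampling and check the inequality pointwise. The mechanism, slack parameter, and cutoffs below are illustrative choices.

```python
import math
import random
from collections import Counter

def laplace_mech(db, epsilon):
    """Counting query with Laplace noise (scale 1/epsilon for sensitivity 1),
    rounded so the output space is discrete and probabilities are estimable."""
    true_count = sum(db)
    # Difference of two iid Exponential(rate=epsilon) draws is Laplace(scale=1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return round(true_count + noise)

def pointwise_test(mech, db1, db2, epsilon, n_trials=200_000, slack=0.2):
    """Empirically check Pr[M(db1)=o] <= e^eps * Pr[M(db2)=o] for common outputs.
    `slack` loosens the bound to absorb sampling error; rare outputs are skipped
    because their probability estimates are too noisy."""
    c1 = Counter(mech(db1, epsilon) for _ in range(n_trials))
    c2 = Counter(mech(db2, epsilon) for _ in range(n_trials))
    for o, k in c1.items():
        p1 = k / n_trials
        p2 = c2.get(o, 0) / n_trials
        if p1 < 0.01:
            continue  # estimate too noisy to be meaningful
        if p1 > math.exp(epsilon) * p2 * (1 + slack):
            return False  # witness of an apparent privacy violation at output o
    return True

db = [1, 0, 1, 1, 0]
neighbor = [1, 0, 1, 1, 1]  # differs in exactly one record
print(pointwise_test(laplace_mech, db, neighbor, epsilon=1.0))
```

A buggy variant (e.g. noise scaled by 1/(2ε) instead of 1/ε) would, with enough trials, produce an output whose probability ratio exceeds e^ε and be rejected; the abstract's false-acceptance analysis concerns exactly how the test size controls the chance that such a violation goes unsampled.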
We analyze the probability of DPCheck mistakenly accepting a non-private program and prove that, theoretically, the probability of false acceptance can be made exponentially small by suitable choice of test size.
We demonstrate DPCheck’s utility empirically by implementing all benchmark algorithms from prior work on mechanical verification of differential privacy, plus several others and their incorrect variants, and show DPCheck accepts the correct implementations and rejects the incorrect variants.
We also demonstrate how DPCheck can be deployed in a practical workflow to test differential privacy for the 2020 US Census Disclosure Avoidance System (DAS).