Abstract

Differential privacy is a cryptographically motivated definition of privacy that has gained significant attention over the past few years. Differentially private solutions enforce privacy by adding random noise to a function computed over the data; the challenge in designing such algorithms is to control the added noise so as to optimize the privacy-accuracy-sample-size tradeoff.
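The noise-addition idea can be illustrated with the standard Laplace mechanism (a common instantiation, not necessarily the construction used in this paper): noise is calibrated to the global sensitivity of the released function divided by the privacy parameter epsilon. The function name and the example data below are illustrative choices.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=None):
    """Release `value` with Laplace noise of scale sensitivity / epsilon.

    Smaller epsilon means stronger privacy and more noise.
    """
    rng = np.random.default_rng() if rng is None else rng
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: privatize the mean of n records, each clipped into [0, 1].
# Changing a single record moves the mean by at most 1/n, so the
# global sensitivity of the mean is 1/n.
data = np.clip(np.array([0.2, 0.9, 0.4, 0.7]), 0.0, 1.0)
n = len(data)
private_mean = laplace_mechanism(data.mean(), sensitivity=1.0 / n, epsilon=1.0)
```

Note how the noise scale 1/(n·epsilon) shrinks as the sample size grows, which is the source of the privacy-accuracy-sample-size tradeoff mentioned above.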
This work studies differentially private statistical estimation, and shows upper and lower bounds on the convergence rates of differentially private approximations to statistical estimators. Our results reveal a formal connection between differential privacy and the notion of Gross Error Sensitivity (GES) in robust statistics, by showing that the convergence rate of any differentially private approximation to an estimator that is accurate over a large class of distributions must grow with the GES of the estimator. We then provide an upper bound on the convergence rate of a differentially private approximation to an estimator with bounded range and bounded GES. We show that the bounded-range condition is necessary if we wish to ensure a strict form of differential privacy.
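The role of the bounded-range condition can be sketched concretely (this is an illustration of the general principle, not the paper's algorithm): forcing each record into a bounded interval [lo, hi] caps the global sensitivity of the sample mean at (hi - lo)/n, which in turn bounds the Laplace noise scale needed for epsilon-differential privacy. All names and numbers below are illustrative.

```python
import numpy as np

def private_clipped_mean(data, lo, hi, epsilon, rng=None):
    """Sketch: release an epsilon-DP estimate of the mean by clipping
    records into [lo, hi] and adding Laplace noise.

    After clipping, replacing one of n records changes the mean by at
    most (hi - lo) / n, so that quantity is the global sensitivity.
    Without the bounded range, a single outlier could move the mean
    arbitrarily far, and no finite noise scale would suffice.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(data, dtype=float), lo, hi)
    sensitivity = (hi - lo) / len(x)
    return x.mean() + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Usage: the outlier 25.0 is clipped to 1.0 before the mean is taken.
est = private_clipped_mean([0.1, 0.5, 25.0, 0.9], lo=0.0, hi=1.0, epsilon=1.0)
```

The clipping step is exactly where the bounded-range requirement enters: it trades a small bias for a finite, controllable noise scale.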