PhD Seminar • Algorithms and Complexity • New Lower Bounds for Private Estimation and a Generalized Fingerprinting Lemma

Monday, October 3, 2022 3:00 pm - 4:00 pm EDT (GMT -04:00)

Please note: This PhD seminar will take place online and in DC 2310.

Argyris Mouzakis, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Gautam Kamath

We prove new lower bounds for statistical estimation tasks under the constraint of $(\varepsilon, \delta)$-differential privacy. First, we provide tight lower bounds for private covariance estimation of Gaussian distributions: estimating the covariance matrix in Frobenius norm requires $\Omega(d^2)$ samples, and estimating it in spectral norm requires $\Omega(d^{3/2})$ samples, both matching known upper bounds up to logarithmic factors. We prove these bounds via our main technical contribution, a broad generalization of the fingerprinting method to exponential families.
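As background only, the sketch below illustrates the classical fingerprinting statistic for Gaussian mean estimation, the basic mechanism that fingerprinting-style lower bounds rest on; it is not the generalized fingerprinting lemma from the talk, and all names and parameter choices (d, n, trials) are illustrative assumptions. The point it demonstrates: an accurate estimator's output necessarily correlates with the samples it was given, while it is uncorrelated with fresh samples; since differential privacy caps how much any single sample can influence the output, the total correlation can only be large when the sample size is large.

# Minimal numerical sketch of the classical fingerprinting statistic for
# Gaussian mean estimation. Illustrative only -- NOT the generalized
# fingerprinting lemma from the talk; all names here are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
d, n, trials = 50, 200, 2000

in_sample, out_sample = [], []
for _ in range(trials):
    mu = rng.normal(size=d)                   # unknown mean
    X = mu + rng.normal(size=(n, d))          # n samples from N(mu, I_d)
    X_fresh = mu + rng.normal(size=(n, d))    # independent fresh samples
    mu_hat = X.mean(axis=0)                   # accurate (non-private) estimator

    # Fingerprinting statistic: correlation of the estimate with each sample.
    in_sample.append(np.sum((X - mu) @ (mu_hat - mu)))
    out_sample.append(np.sum((X_fresh - mu) @ (mu_hat - mu)))

# The in-sample statistic concentrates near d, the out-of-sample one near 0.
# A DP estimator bounds each sample's contribution, so matching the in-sample
# total forces n to be large -- the crux of fingerprinting lower bounds.
print(f"mean in-sample statistic:     {np.mean(in_sample):.2f}  (expected ~ {d})")
print(f"mean out-of-sample statistic: {np.mean(out_sample):.2f}  (expected ~ 0)")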

Additionally, using the private Assouad method of Acharya, Sun, and Zhang, we show a tight $\Omega(\frac{d}{\alpha^2 \varepsilon})$ lower bound for estimating the mean of a distribution with bounded covariance to error $\alpha$ in $\ell_2$-distance. Previously known lower bounds for all of these problems were either polynomially weaker or held only under the stricter condition of $(\varepsilon, 0)$-differential privacy.
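For concreteness, the mean-estimation statement can be written in display form as follows. Only the quantities named above (dimension $d$, accuracy $\alpha$, privacy parameter $\varepsilon$) appear; the covariance normalization $\Sigma \preceq I_d$ and the requirement that $\delta$ be sufficiently small are standard conventions supplied here as assumptions, not details taken from the abstract.

\[
n \;=\; \Omega\!\left(\frac{d}{\alpha^{2}\varepsilon}\right)
\qquad \text{whenever } M \text{ is } (\varepsilon,\delta)\text{-DP and } \mathbb{E}\bigl\|M(X_1,\dots,X_n) - \mu\bigr\|_2 \le \alpha \text{ for every } P \text{ with } \Sigma \preceq I_d .
\]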

Based on joint work with Gautam Kamath and Vikrant Singhal. To appear in NeurIPS 2022.