Please note: This seminar will take place in DC 2568 and online.
Argyris Mouzakis, PhD candidate
David R. Cheriton School of Computer Science
We study person-level differentially private (DP) mean estimation in the case where each person holds multiple samples. DP here requires the usual notion of distributional stability when all of a person’s datapoints can be modified. Informally, if n people each have m samples from an unknown d-dimensional distribution with bounded k-th moments, we show that
n = \tilde{\Theta}\left( \frac{d}{\alpha^{2} m} + \frac{d}{\alpha \sqrt{m}\, \varepsilon} + \frac{d}{\alpha^{k/(k-1)} m \varepsilon} + \frac{d}{\varepsilon} \right)
people are necessary and sufficient to estimate the mean up to distance α in ℓ2-norm under ε-differential privacy (and its common relaxations). In the multivariate setting, we give computationally efficient algorithms under approximate DP and computationally inefficient algorithms under pure DP, and our nearly matching lower bounds hold for the most permissive case of approximate DP. Our computationally efficient estimators are based on the standard clip-and-noise framework, but our setting requires both new algorithmic techniques and new analyses. In particular, our new bounds on the tails of sums of independent, vector-valued, bounded-moments random variables may be of independent interest.
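For readers unfamiliar with the clip-and-noise framework mentioned above, the following is a minimal illustrative sketch, not the paper's estimator: each person's contribution is clipped to a bounded ℓ2 ball, and Gaussian noise calibrated to the resulting sensitivity is added to the average. The pre-aggregation of each person's m samples into a single vector, the clipping radius, and the specific noise calibration below are assumptions made for illustration only.

```python
import numpy as np

def clip_and_noise_mean(data, clip_radius, epsilon, delta, rng=None):
    """Illustrative clip-and-noise mean estimator under (epsilon, delta)-DP.

    `data` has shape (n, d): one pre-aggregated vector per person.
    Each person's vector is clipped to an L2 ball of radius `clip_radius`,
    so changing one person's data moves the sum by at most 2 * clip_radius;
    Gaussian noise calibrated to that sensitivity is added to the average.
    """
    rng = np.random.default_rng() if rng is None else rng
    data = np.asarray(data, dtype=float)
    n, d = data.shape

    # Clip each person's vector to L2 norm at most clip_radius.
    norms = np.linalg.norm(data, axis=1, keepdims=True)
    scale = np.minimum(1.0, clip_radius / np.maximum(norms, 1e-12))
    clipped = data * scale

    # L2 sensitivity of the average with respect to one person's data.
    sensitivity = 2.0 * clip_radius / n

    # One common Gaussian-mechanism calibration (valid for epsilon <= 1).
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon

    return clipped.mean(axis=0) + rng.normal(0.0, sigma, size=d)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # n people, each contributing the average of their own m samples.
    n, m, d = 2000, 10, 5
    samples = rng.standard_normal((n, m, d)) + 3.0  # true mean is 3 in every coordinate
    per_person_means = samples.mean(axis=1)
    estimate = clip_and_noise_mean(per_person_means, clip_radius=10.0,
                                   epsilon=1.0, delta=1e-6, rng=rng)
    print(estimate)
```

The sketch averages each person's samples before clipping purely to keep the example short; the talk's estimators and their analysis handle the person-level setting with different, more careful techniques.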
The paper is joint work with Sushant Agarwal, Gautam Kamath, Mahbod Majid, Rose Silver, and Jon Ullman, and will appear at SODA 2025. It is also available on arXiv.
To join this seminar in person, please go to DC 2568. You can also attend virtually on Zoom.