Seminar • Algorithms and Complexity • On the Gradient Complexity of Private Optimization with Private Oracles

Wednesday, November 19, 2025 12:00 pm - 1:00 pm EST (GMT -05:00)

Please note: This seminar will take place in DC 1304 and online.

Michael Menart, Postdoctoral researcher
Department of Computer Science, University of Toronto

Convex optimization over finite-sum losses is an important primitive in machine learning, and differential privacy is an increasingly important standard for trustworthy learning. As a result, a robust literature on differentially private convex optimization has emerged. The optimal utility rate for this problem has been known for over a decade [BST14], but a formal runtime characterization has remained elusive, with prior work focusing largely on improving runtime upper bounds.

In this talk, I will present work on some of the first runtime lower bounds for a class of differentially private optimizers. Specifically, we introduce the notion of private oracle methods, which are the predominant approach used in practice. This algorithm class includes, for example, the ubiquitous DP-SGD algorithm. For such methods, we provide tight runtime lower bounds in high-dimensional regimes. Our results show that private oracle methods necessarily suffer a dimension-dependent penalty in their runtime as compared to their non-private counterparts. Our results also characterize the impact of minibatch size in differentially private optimization, and show that small minibatches can lead to even larger runtime lower bounds. Our techniques are information-theoretic in nature, and naturally extend to provide lower bounds for (non-private) gradient quantization schemes as well.
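As a rough illustration of the private oracle model referenced above, the sketch below shows a single DP-SGD step: per-example gradients are clipped and perturbed with Gaussian noise before the update. The least-squares loss, clipping threshold, noise multiplier, and learning rate are placeholder choices for illustration only, not details from the talk.

    # Minimal sketch of one DP-SGD step (illustrative hyperparameters, simple squared loss).
    import numpy as np

    def dp_sgd_step(w, X_batch, y_batch, clip_norm=1.0, noise_multiplier=1.0, lr=0.1,
                    rng=np.random.default_rng(0)):
        """One private-oracle step: clip each per-example gradient, average, add noise."""
        grads = []
        for x, y in zip(X_batch, y_batch):
            g = 2 * (w @ x - y) * x                                   # gradient of (w.x - y)^2
            g = g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12)) # clip to norm clip_norm
            grads.append(g)
        avg = np.mean(grads, axis=0)
        noise = rng.normal(0.0, noise_multiplier * clip_norm / len(grads), size=w.shape)
        return w - lr * (avg + noise)                                 # noisy gradient update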

This talk is based on joint work with Aleksandar Nikolov.


To attend this seminar in person, please go to DC 1304. You can also attend virtually on Zoom.