Seminar • Artificial Intelligence • Beyond the Black-Box: Faster Algorithms for Structured Optimization

Wednesday, February 26, 2025 10:30 am - 11:30 am EST (GMT -05:00)

Please note: This seminar will take place in DC 1304.

Swati Padmanabhan, Postdoctoral Researcher
EECS, Massachusetts Institute of Technology

Optimization is central to solving complex problems in data science, operations research, and control. As the challenges of scale, complexity, and resource constraints grow, optimization methods need to keep pace.

In this talk, I will present three recent results that illustrate how combining black-box optimization tools with problem-specific structure can significantly improve computational efficiency. First, I will describe a provably faster interior-point method [JKLPS20] for semidefinite programming, marking the first such advance since the pioneering work of Nesterov and Nemirovskii. Next, I will show how to develop an efficient first-order method [DDLPY22] for finite-dimensional, non-smooth, non-convex optimization by combining cutting-plane methods with underlying convex structure. Finally, I will briefly discuss how integrating the structure of return-on-spend (RoS) constraints into mirror descent achieves near-optimal regret bounds [FPW23] for RoS-constrained online advertising.
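For readers unfamiliar with the mirror-descent machinery the third result builds on, the following is a minimal background sketch of vanilla online mirror descent with the entropy mirror map (the multiplicative-weights update) over the probability simplex. It is purely illustrative and is not the RoS-constrained algorithm of [FPW23]; the step size `eta` and the linear losses here are generic assumptions for the sketch.

```python
import numpy as np

def entropic_mirror_descent(losses, eta):
    """Online mirror descent with the entropy mirror map over the
    probability simplex (equivalently, the multiplicative-weights update).

    losses: (T, n) array of per-round linear loss vectors.
    eta:    step size.
    Returns the sequence of plays and the regret against the best
    fixed coordinate in hindsight.
    """
    T, n = losses.shape
    x = np.full(n, 1.0 / n)              # start at the uniform distribution
    plays = np.empty((T, n))
    for t in range(T):
        plays[t] = x
        # Mirror step for the entropy map: multiplicative update,
        # then projection back onto the simplex via renormalization.
        x = x * np.exp(-eta * losses[t])
        x /= x.sum()
    alg_loss = np.sum(plays * losses)    # sum over rounds of <x_t, loss_t>
    best_fixed = losses.sum(axis=0).min()
    return plays, alg_loss - best_fixed
```

For losses in [0, 1] and `eta` on the order of sqrt(log(n) / T), this update achieves regret O(sqrt(T log n)); the [FPW23] result shows how additionally exploiting the structure of the RoS constraint preserves near-optimal regret in the constrained setting.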


Bio: Swati Padmanabhan is a postdoctoral researcher at MIT, where she works with Ali Jadbabaie and Suvrit Sra on problems in continuous optimization arising in machine learning, control theory, and theoretical computer science.

She received her Ph.D. from the University of Washington in 2023, advised by Yin Tat Lee, with her dissertation focusing on algorithms for semidefinite programs, nonconvex optimization, and online optimization. Previously, she obtained her Master's in Electrical Engineering from UCLA, where she developed algorithms for resource-efficient rapid diagnostic tests for malaria, and then spent some time in industry as a signal processing engineer.