PhD Seminar • Machine Learning • Inherent Limitations for Characterizing Distribution Learning

Friday, May 24, 2024 11:30 am - 12:30 pm EDT (GMT -04:00)

Please note: This PhD seminar will take place online.

Tosca Lechner, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Shai Ben-David

We consider the long-standing question of finding a parameter of a class of probability distributions that characterizes its PAC learnability. While for many learning tasks (such as binary classification and online learning) there is a notion of dimension whose finiteness is equivalent to learnability within any level of accuracy, we show, rather surprisingly, that no such parameter exists for distribution learning.
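As a point of contrast, the kind of characterization that does exist for binary classification is the VC dimension: a class is PAC learnable if and only if its VC dimension is finite. The brute-force shattering check below is a hypothetical illustration (the hypothesis class and helper names are not from the talk) showing why 1-D threshold classifiers have VC dimension 1.

```python
def shatters(hypotheses, points):
    """Check whether the hypothesis class realizes every labeling of `points`."""
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

# Illustrative class: 1-D threshold classifiers h_t(x) = 1 iff x >= t.
thresholds = [lambda x, t=t: int(x >= t) for t in [-1.5, -0.5, 0.5, 1.5, 2.5]]

# A single point {0} is shattered: both labels are achievable.
print(shatters(thresholds, [0]))     # True
# No pair of points is shattered: for x1 < x2 the labeling (1, 0) is
# impossible for thresholds, so the VC dimension is 1 (finite => learnable).
print(shatters(thresholds, [0, 1]))  # False
```

The seminar's result is that no analogous finiteness-based parameter can play this role for classes of probability distributions.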

Concretely, our results apply to several general notions of characterizing learnability and to several learning tasks. We show that there is no notion of scale-invariant dimension that characterizes the sample complexity of learning distribution classes. We then consider the weaker requirement of characterizing only learnability (rather than the quantitative sample complexity function). We propose some natural requirements for such a characterizing dimension and go on to show that no characterization of learnability satisfying these requirements exists for classes of distributions.

To attend this PhD seminar on Zoom, please go to