Please note: This PhD defence will be given online.
Guojun Zhang, PhD candidate
David R. Cheriton School of Computer Science
Supervisors: Professors Pascal Poupart, Yaoliang Yu
Recent years have seen a surge of interest in building learning machines through adversarial training. In most cases, adversarial training is formulated as minimax optimization or, more broadly, as a smooth game. Recent minimax optimization research has two main focuses. The first is solution concepts: what is a desirable solution concept that is both meaningful in practice and easy to compute? The second is the design of stable and efficient algorithms. In this defence I will present my research during my PhD that addresses these two problems, including the study of solution concepts, stability criteria for gradient algorithms, and convergence analysis of Newton-type methods.
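As background (this sketch is illustrative and not taken from the thesis), a standard example of why algorithm stability matters in minimax optimization is the bilinear game min_x max_y f(x, y) = xy: naive simultaneous gradient descent-ascent spirals away from the equilibrium at the origin, while the extragradient method, which takes a look-ahead step before updating, converges. The function names, step size, and iteration count below are all made up for illustration.

```python
# Illustrative only: simultaneous gradient descent-ascent (GDA) vs. the
# extragradient method on the bilinear game min_x max_y f(x, y) = x * y,
# whose unique equilibrium is (0, 0).

def gda_step(x, y, eta):
    """Simultaneous GDA: each player moves along its current gradient."""
    return x - eta * y, y + eta * x

def extragradient_step(x, y, eta):
    """Extragradient: take a look-ahead half-step, then update using the
    gradient evaluated at the look-ahead point."""
    xh, yh = x - eta * y, y + eta * x   # look-ahead step
    return x - eta * yh, y + eta * xh   # corrected update

def run(step, steps=100, eta=0.1):
    """Run an update rule from (1, 1); return final distance to (0, 0)."""
    x, y = 1.0, 1.0
    for _ in range(steps):
        x, y = step(x, y, eta)
    return (x * x + y * y) ** 0.5

# GDA's distance to the equilibrium grows each step (it spirals outward),
# while extragradient's distance shrinks toward zero.
print(run(gda_step))
print(run(extragradient_step))
```

On this game, each GDA step multiplies the distance to the origin by sqrt(1 + eta^2) > 1, whereas each extragradient step multiplies it by sqrt(1 - eta^2 + eta^4) < 1 for small eta, which is one concrete sense in which the two algorithms differ in stability.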
To join this PhD defence on Zoom, please go to https://vectorinstitute.zoom.us/j/96838893661?pwd=UUlLekxNQzJjOFY0a29VNVppSG1MUT09.