Please note: This PhD seminar will take place online.
Aseem Baranwal, PhD candidate
David R. Cheriton School of Computer Science
Supervisors: Professors Kimon Fountoulakis and Aukosh Jagannath
We present a rigorous theoretical understanding of the effects of graph convolutions in multi-layer networks. We study these effects through the node classification problem of a non-linearly separable Gaussian mixture model coupled with a stochastic block model. First, we show that a single graph convolution expands the regime of the distance between the means where multi-layer networks can classify the data by a factor of at least (1/D)^0.25, where D denotes the expected degree of a node. Second, we show that with a slightly stronger graph density, two graph convolutions improve this factor to at least (1/n)^0.25, where n is the number of nodes in the graph. Finally, we provide both theoretical and empirical insights into the performance of graph convolutions placed in different combinations among the layers of a network, concluding that the performance is similar across all placements.
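The data model and the graph convolution operation described above can be sketched in a few lines of NumPy. The following is a minimal illustration, not the authors' code: it samples an XOR-style Gaussian mixture (class 0 centered at +/-mu, class 1 at +/-nu, so not linearly separable), couples it with a two-block stochastic block model whose blocks follow the class labels, and applies degree-normalized graph convolutions of the form D^{-1}(A + I)X. The parameter values, the choice of means, and the exact normalization are illustrative assumptions rather than the paper's precise setup.

import numpy as np

# Minimal sketch (not the authors' code): XOR-style Gaussian mixture coupled
# with a two-block stochastic block model, followed by graph convolutions.
rng = np.random.default_rng(0)

n, sigma = 1000, 0.3                      # number of nodes, noise level
mu = np.array([1.0, 0.0])                 # class-0 centers: +/- mu (assumed)
nu = np.array([0.0, 1.0])                 # class-1 centers: +/- nu (assumed)
p, q = 0.05, 0.01                         # intra-/inter-class edge probabilities (assumed)

y = rng.integers(0, 2, size=n)            # class labels
signs = rng.choice([-1.0, 1.0], size=n)   # which of the two centers within a class
centers = signs[:, None] * np.where(y[:, None] == 0, mu, nu)
X = centers + rng.normal(0.0, sigma, size=(n, 2))

# Stochastic block model adjacency: denser within classes than across them
probs = np.where(y[:, None] == y[None, :], p, q)
A = np.triu(rng.random((n, n)) < probs, 1).astype(float)
A = A + A.T                               # symmetric, no self-loops

# Graph convolution: replace each node's features by the degree-normalized
# average over the node itself and its neighbours
A_tilde = A + np.eye(n)
deg = A_tilde.sum(axis=1, keepdims=True)
X_1 = A_tilde @ X / deg                   # one graph convolution
X_2 = A_tilde @ X_1 / deg                 # two graph convolutions

In the paper, such convolutions are analyzed when placed at different layers of a multi-layer network; the sketch only shows the convolution applied directly to the input features.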
Based on the paper: Effects of Graph Convolutions in Multi-layer Networks. A. Baranwal, K. Fountoulakis, A. Jagannath. International Conference on Learning Representations, 2023.