PhD Seminar • Algorithms and Complexity — Geodesic Convexity in Statistics and Sample Complexity Bounds

Wednesday, March 17, 2021, 11:00 am EDT (GMT -04:00)

Please note: This PhD seminar will be given online.

Akshay Ramachandran, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Lap Chi Lau

The matrix normal model, the family of Gaussian matrix-variate distributions whose covariance matrix is the Kronecker product of two lower-dimensional factors, is frequently used to model matrix-variate data. The tensor normal model generalizes this family to Kronecker products of three or more factors.
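For concreteness, the following is a minimal NumPy sketch of sampling from a matrix normal model; the function name and conventions are illustrative, not taken from the talk. Each sample X = L1 Z L2^T, where Z has i.i.d. standard normal entries and L1, L2 are Cholesky factors of the two covariance factors, has the required Kronecker covariance structure.

import numpy as np

def sample_matrix_normal(n, S1, S2, seed=None):
    # Draw n samples X with vec(X) ~ N(0, S2 kron S1) under column-major
    # vectorization; the ordering of the Kronecker factors depends on
    # this convention.
    rng = np.random.default_rng(seed)
    L1 = np.linalg.cholesky(S1)   # S1 = L1 L1^T
    L2 = np.linalg.cholesky(S2)   # S2 = L2 L2^T
    Z = rng.standard_normal((n, S1.shape[0], S2.shape[0]))
    return L1 @ Z @ L2.T          # each X_i = L1 Z_i L2^T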

We study the estimation of the Kronecker factors of the covariance matrix in the matrix and tensor normal models. We establish nonasymptotic bounds for the maximum likelihood estimator (MLE) of the factors in several natural metrics. In contrast to existing bounds, our results do not depend on the factors being well-conditioned. For the matrix normal model, all of our bounds are minimax optimal up to logarithmic factors; for the tensor normal model, our bounds for the largest factor and for the overall covariance matrix are minimax optimal provided there are enough samples for any estimator to achieve better than constant Frobenius error. In the same regimes as our sample complexity bounds, we show that an iterative procedure for computing the MLE, known as the flip-flop algorithm, converges linearly with high probability. Our main tool is geodesic convexity in the Fisher-Rao metric on positive definite matrices. We also provide numerical evidence that a simple regularizer can improve performance in the undersampled regime.
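The flip-flop algorithm alternately holds one Kronecker factor fixed and solves for the other in closed form. Below is a minimal NumPy/SciPy sketch of the standard flip-flop updates for the matrix normal model, with successive iterates compared in the Fisher-Rao (affine-invariant) metric; the initialization, stopping rule, and parameter names are illustrative assumptions, and the variant analyzed in the talk may differ.

import numpy as np
from scipy.linalg import eigvalsh

def fisher_rao_distance(A, B):
    # Affine-invariant (Fisher-Rao) distance on positive definite matrices,
    # ||log(A^{-1/2} B A^{-1/2})||_F, computed from the generalized
    # eigenvalues of (B, A), i.e. the eigenvalues of A^{-1} B.
    w = eigvalsh(B, A)
    return np.sqrt(np.sum(np.log(w) ** 2))

def flip_flop(X, iters=100, tol=1e-8):
    # Alternating maximum likelihood updates for the matrix normal model.
    # X has shape (n, p, q): n centered p-by-q samples. The factors are
    # only identifiable up to scaling, since (c*S1, S2/c) yields the same
    # Kronecker product covariance.
    n, p, q = X.shape
    S1, S2 = np.eye(p), np.eye(q)
    for _ in range(iters):
        S2_inv = np.linalg.inv(S2)
        S1 = sum(Xi @ S2_inv @ Xi.T for Xi in X) / (n * q)
        S1_inv = np.linalg.inv(S1)
        S2_new = sum(Xi.T @ S1_inv @ Xi for Xi in X) / (n * p)
        done = fisher_rao_distance(S2, S2_new) < tol
        S2 = S2_new
        if done:
            break
    return S1, S2

With enough samples, the product of the returned factors approximates the true covariance; in the undersampled regime the MLE need not exist, which is where a regularizer of the kind mentioned above would enter.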

To join this PhD seminar on Zoom, please go to https://zoom.us/j/91652924166?pwd=bThMZHdRNU9iVXIvcGczbEhWazhpdz09.