Seminar

Tuesday, April 6, 2021 12:00 pm EDT (GMT -04:00)

Seminar • Machine Learning — Towards Unsupervised 3D Deep Learning

Please note: This seminar will be given online.

Andrea Tagliasacchi, Research Scientist
Google Brain

It is not uncommon to think of computer graphics and computer vision as largely disconnected disciplines: the former dealing with the synthesis of visual phenomena, the latter with their analysis. However, recent advances in deep learning have blurred the boundary between the two. As a consequence, the research path to develop algorithms that effectively interpret the 3D scene “behind” an image has never seemed so well within reach.

Wednesday, March 31, 2021 12:00 pm EDT (GMT -04:00)

PhD Seminar • Data Systems — Evaluating Complex Queries on Streaming Graphs

Please note: This PhD seminar will be given online.

Anil Pacaci, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Tamer Özsu

Modern applications in many domains operate on streaming graphs that continuously evolve at high rates. Efficiently querying these streaming graphs is crucial for applications that monitor complex patterns and relationships.

Thursday, March 25, 2021 12:00 pm EDT (GMT -04:00)

Seminar • Systems and Networking — Resource-Efficient Execution for Deep Learning

Please note: This seminar will be given online.

Deepak Narayanan, Department of Computer Science
Stanford University

Deep learning models have enabled state-of-the-art results across a broad range of applications; however, training these models is extremely time- and resource-intensive, in the extreme case taking weeks on clusters with thousands of expensive accelerators.

Please note: This PhD seminar will be given online.

Akshay Ramachandran, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Lap Chi Lau

The matrix normal model, the family of Gaussian matrix-variate distributions whose covariance matrix is the Kronecker product of two lower-dimensional factors, is frequently used to model matrix-variate data. The tensor normal model generalizes this family to Kronecker products of three or more factors.
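The Kronecker covariance structure mentioned above can be checked numerically. The sketch below (dimensions, covariance factors, and sample count are illustrative assumptions, not taken from the seminar) draws samples X = L_U Z L_V^T, whose vectorization has covariance given by the Kronecker product of the two factors:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(d, rng):
    """Random symmetric positive-definite matrix (illustrative choice)."""
    R = rng.standard_normal((d, d))
    return R @ R.T + d * np.eye(d)

# Illustrative dimensions and covariance factors (assumptions, not from the talk)
m, n = 3, 4
U = random_spd(m, rng)  # row-covariance factor
V = random_spd(n, rng)  # column-covariance factor

# Sample X ~ MN(0, U, V): X = L_U Z L_V^T with Z an i.i.d. standard normal matrix
L_U = np.linalg.cholesky(U)
L_V = np.linalg.cholesky(V)
Z = rng.standard_normal((100_000, m, n))
samples = L_U @ Z @ L_V.T  # batched matrix products via broadcasting

# Row-major flattening of X equals the column-major vec of X^T,
# whose covariance is the Kronecker product U ⊗ V.
vecs = samples.reshape(len(samples), -1)
emp_cov = vecs.T @ vecs / len(vecs)

rel_err = np.abs(emp_cov - np.kron(U, V)).max() / np.abs(np.kron(U, V)).max()
print(rel_err)  # shrinks toward 0 as the sample count grows
```

The same construction extends to the tensor normal model: with three or more factors, a sample is obtained by multiplying an i.i.d. Gaussian tensor along each mode by the corresponding Cholesky factor.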