Nicholas Vadivelu receives 2021 Jessie W.H. Zou Memorial Award

Tuesday, May 25, 2021

Nicholas Vadivelu, an undergraduate student majoring in computer science and statistics, has received the 2021 Jessie W.H. Zou Memorial Award for Excellence in Undergraduate Research. Established in 2012, this prestigious annual award recognizes excellence in research conducted by an undergraduate student in the Faculty of Mathematics. 

Nicholas’s excellence in research spans several projects, most notably his contributions to improving the accuracy of machine learning algorithms as a research intern at Uber Advanced Technologies Group, and to increasing the speed of machine learning algorithms as an undergraduate research assistant with Professor Gautam Kamath.

“Congratulations to Nicholas on receiving a 2021 Jessie W.H. Zou Memorial Award,” said Raouf Boutaba, Professor and Director of the Cheriton School of Computer Science. “He is an exceptionally bright and motivated undergraduate student who not only conducted impressive research with real-world implications, but also disseminated his findings to industry and academia at international conferences.”

Uber ATG • Learned Positional Error Correction System

During his internship at Uber ATG, Nicholas was an integral member of a research team that developed an end-to-end learned neural reasoning framework for self-driving vehicles. 

Self-driving vehicles can share sensory information with nearby vehicles, which makes them safer to operate, but this sharing also makes them susceptible to erroneous messages they might receive. On a project conducted with University of Toronto PhD candidate Mengye Ren and Professor Raquel Urtasun, Nicholas developed this framework, which learns to estimate localization errors, communicate those errors to other vehicles, and reach a consensus about them. The framework allows V2VNet, a state-of-the-art collaborative self-driving neural network, to perform detection and forecasting with almost no drop in performance at noise levels much greater than those typically seen in real self-driving vehicle systems.
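The paper’s learned architecture is beyond the scope of this article, but the basic idea of reaching consensus on pose errors can be illustrated with a toy example. The sketch below is a generic iterative-averaging illustration, not the method from the paper; all names and numbers are hypothetical.

```python
# A toy illustration of consensus on pose-error estimates, NOT the paper's
# learned method: each vehicle holds a noisy estimate of a shared (dx, dy)
# pose error, and repeated averaging with the group pulls the estimates
# toward agreement.
import numpy as np

def consensus_rounds(estimates, rounds=5):
    # estimates: (num_vehicles, 2) array, one (dx, dy) error estimate per vehicle
    for _ in range(rounds):
        mean = estimates.mean(axis=0)              # pooled message from all vehicles
        estimates = 0.5 * estimates + 0.5 * mean   # each vehicle moves toward the mean
    return estimates

noisy = np.array([[0.9, -0.2], [1.2, 0.1], [1.0, -0.1]])  # three vehicles' estimates
print(consensus_rounds(noisy))  # rows converge toward the common mean
```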

“I was surprised to learn how deep Nicholas had gone into AI and machine learning before landing his internship position at Uber,” Mengye Ren said. “He was a third-year undergraduate student, but he knew no less than a regular PhD student in the field. He immediately ramped up on our core self-driving technologies developed at Uber and started experimenting with new algorithms. I asked him to implement a learnable module based on probabilistic graphical models. Although he had not been exposed to that beforehand, he was able to derive the formulation of the new model quickly after learning the concepts. His top-grade coding and math skills kept our collaboration marching towards our goal at lightning speed.”

Nicholas’s research was published in a paper titled “Learning to Communicate and Correct Pose Errors” and presented at CoRL 2020, the 4th Annual Conference on Robot Learning.

Cheriton School of Computer Science • Computationally Efficient Differentially Private Stochastic Gradient Descent

Nicholas also contributed significantly to differentially private machine learning, specifically by developing an implementation that dramatically increases the speed at which differentially private stochastic gradient descent is executed.

Differential privacy provides a theoretical framework to limit what can be deduced about data given a model trained on that data. Privacy is critical when dealing with sensitive user data — an individual’s personal and confidential information that ranges from financial transactions to medical records. Differentially private stochastic gradient descent, or DPSGD, is the workhorse algorithm for training differentially private models, ones that operate on a dataset but do not allow any individual data point within it to be identified. Unfortunately, DPSGD is slow and memory hungry, which limits its use in practice.
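At a high level, each DPSGD step computes a gradient for every individual example, clips each of those gradients to a norm bound, and adds calibrated Gaussian noise before updating the model. The sketch below is a minimal, illustrative implementation of one such step, not code from the project; the function and parameter names are assumptions for exposition. The per-example loop is exactly what makes naive implementations slow and memory hungry.

```python
# A minimal sketch of one DPSGD step (illustrative names, not project code).
import numpy as np

def dpsgd_step(params, examples, grad_fn, rng, lr=0.1, C=1.0, sigma=1.1):
    clipped_sum = np.zeros_like(params)
    for x, y in examples:                       # one gradient per training example
        g = grad_fn(params, x, y)
        norm = max(np.linalg.norm(g), 1e-12)
        clipped_sum += g * min(1.0, C / norm)   # clip each gradient to norm C
    noise = rng.normal(0.0, sigma * C, size=params.shape)  # calibrated Gaussian noise
    return params - lr * (clipped_sum + noise) / len(examples)

# Toy usage with a linear least-squares loss:
grad_fn = lambda w, x, y: 2 * (w @ x - y) * x
data = [(np.array([1.0, 2.0]), 1.0), (np.array([0.5, -1.0]), -0.5)]
w = dpsgd_step(np.zeros(2), data, grad_fn, np.random.default_rng(0))
```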

With MMath student Pranav Subramani and Professor Kamath, Nicholas co-led a project on accelerating DPSGD. His key innovation was using explicit vectorization and just-in-time compilation to create a state-of-the-art implementation up to 50 times faster than the best alternatives. Nicholas then integrated this implementation into two open-source libraries to make private machine learning broadly accessible to both researchers and engineers.
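The sketch below illustrates the general approach on a toy linear model; it is a hedged approximation of the technique the paper describes, not the authors’ code. JAX’s vmap transforms a function that computes one example’s gradient into one that computes the whole batch of per-example gradients at once, and jit compiles the full private update into a single fused computation, avoiding the slow per-example Python loop.

```python
import jax
import jax.numpy as jnp

def loss_fn(params, x, y):
    return (jnp.dot(x, params) - y) ** 2        # toy linear model, squared error

# vmap vectorizes the single-example gradient over the batch dimension
per_example_grads = jax.vmap(jax.grad(loss_fn), in_axes=(None, 0, 0))

@jax.jit  # compile the whole private update into one fused XLA computation
def private_update(params, xs, ys, key, lr=0.1, C=1.0, sigma=1.1):
    grads = per_example_grads(params, xs, ys)        # shape (batch, num_params)
    norms = jnp.linalg.norm(grads, axis=1, keepdims=True)
    clipped = grads * jnp.minimum(1.0, C / norms)    # per-example clipping
    noise = sigma * C * jax.random.normal(key, params.shape)
    return params - lr * (clipped.sum(axis=0) + noise) / xs.shape[0]

# Toy usage:
params = private_update(jnp.ones(2),
                        jnp.array([[1.0, 2.0], [0.5, -1.0]]),
                        jnp.array([1.0, 0.0]),
                        jax.random.PRNGKey(0))
```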

“I am incredibly impressed by Nic’s abilities, and his contributions to our joint project have been absolutely invaluable,” Professor Kamath said. “A common problem in differentially private machine learning and statistics is the fact that DPSGD is quite slow compared with the non-private version. Nic’s remarkably mature knowledge and experience with machine learning technologies helped lead us to a new framework, JAX, which would serve as a key part of the solution. He was able to suggest many insightful experiments to run. He simultaneously had a very deep understanding of how things worked and was able to explain them to me as well. Nic is a co-first author of our paper titled ‘Enabling Fast Differentially Private SGD via Just-in-Time Compilation and Vectorization,’ which appeared at the Workshop on Privacy-Preserving Machine Learning at NeurIPS ’20. Since this paper appeared online, several differential privacy researchers have commented on how useful and important this result is. I am confident that this finding will make a huge impact in the practice of private machine learning over the next several years.”

In addition to these projects, Nicholas has conducted research on machine learning–driven software analysis with Professor Lin Tan of the Department of Electrical and Computer Engineering, practical second-order optimization algorithms for neural networks with Professor Pascal Poupart of the Cheriton School of Computer Science, and probabilistic modelling with Professor Martin Lysy of the Department of Statistics and Actuarial Science. He has also improved K-FAC, a neural network optimization library, at Google Brain, accelerated BERT inference on NVIDIA’s TensorRT team, and developed data-driven models to identify fraud at John Hancock Financial.


Jessie W.H. Zou Memorial Award for Excellence in Undergraduate Research

Cheriton School of Computer Science Professor Ming Li and his family established the Jessie W.H. Zou Memorial Award for Excellence in Undergraduate Research in 2012 in honour of his late wife, Jessie Wenhui Zou.

Jessie was born on August 27, 1968 in Wuhan, China. She received her PhD in physics from Wuhan University when she was 26 years old. She loved the field of finance and continued to pursue her studies, graduating from the University of Waterloo’s statistical finance master’s program in 2000. Upon graduation, she worked at the Bank of Montreal, specializing in risk control management.

The Jessie W.H. Zou Memorial Award for Excellence in Undergraduate Research is valued at $1,000 and is presented annually to an undergraduate student enrolled in his or her final year of any program within the Faculty of Mathematics. Students must have demonstrated excellence in research and have been nominated by a faculty member who has supervised that research. The awardee is chosen by the Faculty of Mathematics Research Advisory Committee, chaired by the Associate Dean, Research.

Past awardees