Please note: This PhD defence will take place in DC 3317 and online.
Thomas Humphries, PhD candidate
David R. Cheriton School of Computer Science
Supervisor: Professor Florian Kerschbaum
There is no one-size-fits-all solution to preserving privacy in data science. While insights derived from sensitive data can benefit society, privacy is typically at odds with utility, performance, usability, or some combination of these objectives. Furthermore, each system differs in how it defines these objectives and in the way they interact with one another. If the cost in any one of these objectives is too high, the system will not be deployed or, worse, deployed with a weakened privacy guarantee, exposing users to potential harm.
In my work, I address this challenge from multiple angles. First, through strategic algorithm design, my work creates private systems with improved trade-offs, enabling their deployment. This includes a more efficient protocol for secure inference of deep machine learning models and a novel construction for aggregating key-value data within the local trust model. Second, my work audits private systems to expose the privacy risks associated with misleading privacy claims. In particular, through a privacy audit of machine learning, I highlight a gap between the privacy protections users expect and those the systems actually provide.
To attend this PhD defence in person, please go to DC 3317. You can also attend virtually on BigBlueButton.