PhD Defence • Programming Languages • Design and Implementation of Probabilistic Programming Languages for Sound and Scalable Inference

Tuesday, June 23, 2026 10:00 am - 1:00 pm EDT (GMT -04:00)

Please note: This PhD defence will take place in DC 2314 and online.

Jianlin Li, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Yizhou Zhang

Probabilistic programming languages (PPLs) provide a powerful framework for specifying and solving complex Bayesian inference problems using general-purpose programming constructs. However, the same linguistic expressiveness that makes PPLs appealing also introduces challenges for performing inference that is both sound (i.e., semantically correct) and scalable (i.e., efficient even for recursive or high-dimensional models).

This thesis explores the design and implementation of PPLs by developing novel compilation strategies tailored to different inference methods and compilation artifacts. It centers on four major systems.

Fidelio addresses deep amortized inference by generating neural guide programs via a type-preserving and dependence-aware translation, ensuring soundness with respect to absolute continuity.

Mappl targets exact inference via variable elimination, compiling probabilistic programs with bounded recursion to factor functions guided by an information-flow type system.
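To illustrate the inference method Mappl targets, the following is a minimal, self-contained sketch of variable elimination on a toy two-variable discrete model. The factor representation, the model, and all function names here are illustrative assumptions, not Mappl's actual compilation output.

```python
import itertools

# A factor is a pair: (tuple of variable names, dict from value-tuples to weights).
# All variables in this toy example are Boolean (0/1).

def multiply(f, g):
    """Pointwise product of two factors over the union of their variables."""
    fv, ft = f
    gv, gt = g
    vs = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for vals in itertools.product([0, 1], repeat=len(vs)):
        env = dict(zip(vs, vals))
        table[vals] = ft[tuple(env[v] for v in fv)] * gt[tuple(env[v] for v in gv)]
    return (vs, table)

def sum_out(f, var):
    """Eliminate a variable by summing it out of the factor."""
    fv, ft = f
    keep = tuple(v for v in fv if v != var)
    table = {}
    for vals, w in ft.items():
        key = tuple(v for v, n in zip(vals, fv) if n != var)
        table[key] = table.get(key, 0.0) + w
    return (keep, table)

# Toy model: A ~ Bernoulli(0.4); P(B=1 | A=0) = 0.1, P(B=1 | A=1) = 0.8.
pA = (("A",), {(0,): 0.6, (1,): 0.4})
pBA = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})

# Eliminating A from the product of the two factors yields the marginal P(B).
pB = sum_out(multiply(pA, pBA), "A")
print(pB[1])  # {(0,): 0.62, (1,): 0.38}
```

Compiling a program to factor functions like these lets an inference engine choose an elimination order over the factors rather than re-execute the program.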

Geni enables scalable exact inference for discrete models by compiling to generating functions, offering a mathematically principled and efficient representation.
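As a rough intuition for the generating-function representation, a probability generating function (PGF) encodes a discrete distribution as a polynomial, and the PGF of a sum of independent variables is the product of their PGFs. The sketch below is an illustrative example of that standard identity, not Geni's actual compilation scheme.

```python
# Represent a PGF as a coefficient list: G(x) = sum_k P(X = k) * x^k.

def pgf_bernoulli(p):
    """PGF of a Bernoulli(p) variable: (1 - p) + p * x."""
    return [1 - p, p]

def pgf_product(g, h):
    """PGF of the sum of two independent variables: polynomial product."""
    out = [0.0] * (len(g) + len(h) - 1)
    for i, a in enumerate(g):
        for j, b in enumerate(h):
            out[i + j] += a * b
    return out

# Sum of three independent fair coins: Binomial(3, 0.5).
g = pgf_bernoulli(0.5)
total = pgf_product(pgf_product(g, g), g)
print(total)  # [0.125, 0.375, 0.375, 0.125]
```

The appeal of the representation is that operations on distributions become algebraic operations on functions, which can be manipulated exactly.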

Tessa reframes probabilistic model checking for step-bounded reachability as tensor computations, enabling sound compilation into JAX programs and accelerator-backed execution for substantial speedups.
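The core idea of phrasing step-bounded reachability as tensor computation can be sketched with a tiny Markov chain: make the target state absorbing, then take repeated matrix-vector products to push the initial distribution forward k steps. This toy example uses NumPy in place of JAX and is an illustrative sketch, not Tessa's actual compilation output.

```python
import numpy as np

# Transition matrix of a 3-state Markov chain; state 2 is the target and is
# made absorbing, so probability mass that reaches it stays there.
P = np.array([[0.5, 0.3, 0.2],
              [0.0, 0.6, 0.4],
              [0.0, 0.0, 1.0]])

def reach_within(P, start, target, k):
    """Pr(reach `target` from `start` within k steps), via k mat-vec products."""
    dist = np.zeros(P.shape[0])
    dist[start] = 1.0
    for _ in range(k):
        dist = dist @ P  # one step of the chain as a tensor contraction
    return dist[target]

print(round(reach_within(P, start=0, target=2, k=2), 4))  # 0.42
```

Because each step is a dense tensor contraction, the whole computation maps directly onto accelerator-friendly frameworks such as JAX.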

The central thesis is that the compiler perspective leads to sound, scalable ways to automate probabilistic inference for a rich class of models.


To attend this PhD defence in person, please go to DC 2314. You can also attend virtually on Zoom.