Please note: This PhD defence will take place online.
Jesse Allister Kasian Elliott, PhD candidate
David R. Cheriton School of Computer Science
Supervisors: Professors David Jao, Éric Schost
Algebraic geometry has several real-world applications where linear algebra is insufficient, particularly when dealing with nonlinear structures. Algorithms in algebraic geometry are often regarded as impractical because of their high worst-case complexity, yet many of them are efficient in practice. When computing over a ground field such as $\mathbb{Q}$, obtaining fast runtimes in practice requires an analysis of the algorithm's bit complexity and the use of modular techniques that avoid uncontrolled coefficient growth. This thesis investigates complexity in computational algebraic geometry, with a focus on bit-complexity questions in real algebraic geometry, as well as $2^a$-isogeny computation on Legendre curves.
Analyzing the bit complexity enables the development of modular extensions that control the growth of intermediate expressions, leading to implementations that are efficient in practice.
The main contributions are as follows.
- We analyze the bit complexity of an algorithm that computes one point per connected component of a smooth real algebraic set, due to Safey El Din and Schost ("Polar varieties and computation of one point in each connected component of a smooth real algebraic set", ISSAC'03). The algorithm uses random changes of variables that are proven to generically ensure desirable geometric properties, such as weak transversality and Noether position for polar varieties (a small illustrative sketch of such a change of variables appears after this list). The cost of the algorithm was originally given in an algebraic complexity model. We analyze its bit complexity and error probability, and we provide a quantitative analysis of the genericity statements.
- The Chinese remainder theorem (CRT) underlies many modular algorithms: the growth of intermediate expressions is controlled by performing the computation modulo several small primes and then reconstructing the output modulo the product of those primes. Some primes are unlucky, producing either an incorrect result or no result at all; their number can be bounded by exhibiting a nonzero $U \in \mathbb{Z}$ such that every unlucky prime divides~$U$. Without allowing for errors, CRT-based modular algorithms require all primes to be lucky, which typically forces the use of larger primes. Error-correction techniques (introduced by Pernet and by B\"ohm et al.) tolerate a small number of unlucky primes, thereby reducing the required prime size. We provide a quantitative analysis giving explicit sufficient conditions on the number and size of the primes that guarantee a given success probability, assuming upper bounds on the height of the output and on the integer $U$. We also demonstrate some applications; a minimal CRT reconstruction sketch appears after this list.
- We give bit-size estimates for computing roadmaps of smooth, bounded real hypersurfaces. Roadmaps are used to decide whether two points in a real algebraic set are connected by a continuous path, and therefore have applications in robot motion planning. To obtain our height estimates, we apply tools from intersection theory involving the arithmetic Chow ring, developed by D’Andrea, Krick, and Sombra.
- We introduce a method for efficiently computing $2^a$-isogenies between elliptic curves in Legendre form, with applications in post-quantum cryptography. Most work on isogeny computation uses elliptic curves in Montgomery form; this includes the work on Supersingular Isogeny Diffie-Hellman (SIDH) by Jao, De Feo, and Plût, as well as Supersingular Isogeny Key Encapsulation (SIKE). To the best of our knowledge at the time of writing, elliptic curves in Legendre form had not been explored for isogeny-based cryptography. Legendre form is an interesting family to study, having a very simple defining equation and the simplest possible representation of the $2$-torsion subgroup (a small sketch of the $2$-torsion appears after this list).
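
The first contribution relies on random linear changes of variables. The following Python sketch is purely illustrative (using SymPy; the polynomial and the entry range of the random matrix are arbitrary choices, not taken from the thesis) and shows what applying such a change of variables to a polynomial looks like.

```python
# Illustrative only: apply a random invertible linear change of variables
# to a polynomial with SymPy.  Generic changes of variables of this kind
# are what the analyzed algorithm uses to ensure properties such as
# Noether position; here we only demonstrate the substitution itself.
import random
import sympy as sp

x, y, z = sp.symbols('x y z')
f = x**2 + y**2 + z**2 - 1            # arbitrary example polynomial

# Draw a random matrix with small integer entries until it is invertible.
n = 3
while True:
    A = sp.Matrix(n, n, lambda i, j: random.randint(-5, 5))
    if A.det() != 0:
        break

old_vars = [x, y, z]
new_vars = A * sp.Matrix(old_vars)    # each variable becomes a linear form
substitution = dict(zip(old_vars, list(new_vars)))

f_generic = sp.expand(f.subs(substitution, simultaneous=True))
print(f_generic)
```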
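
For the CRT-based contribution, here is a minimal, self-contained Python sketch of the basic reconstruction step (the primes and the sample integer are toy values chosen for illustration, not parameters from the thesis): an integer is recovered from its residues modulo several small primes and then lifted to the symmetric range. An unlucky prime would contribute a wrong residue and corrupt exactly this step.

```python
# Minimal CRT reconstruction sketch (illustrative toy values, not the
# thesis algorithm).  An "unlucky" prime would supply a wrong residue and
# corrupt the reconstruction, which is why such primes must be bounded
# or corrected for.
from math import prod

def crt(residues, primes):
    """Return x with x = residues[i] (mod primes[i]), 0 <= x < prod(primes)."""
    M = prod(primes)
    x = 0
    for r, p in zip(residues, primes):
        Mi = M // p
        x += r * Mi * pow(Mi, -1, p)   # pow(Mi, -1, p) is the inverse of Mi mod p
    return x % M

def symmetric_lift(x, M):
    """Map a residue modulo M to the symmetric range (-M/2, M/2]."""
    return x - M if x > M // 2 else x

primes = [101, 103, 107, 109]                  # toy primes
secret = -123456                               # integer the computation "produces"
residues = [secret % p for p in primes]        # images computed modulo each prime

M = prod(primes)
recovered = symmetric_lift(crt(residues, primes), M)
print(recovered == secret)                     # True, since |secret| < M/2
```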
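
Finally, a tiny Python sketch of the $2$-torsion structure mentioned in the last item (the prime and the Legendre parameter below are toy values, not from the thesis): on a Legendre curve $y^2 = x(x-1)(x-\lambda)$, the nontrivial $2$-torsion points are exactly the points with $y = 0$, namely $(0,0)$, $(1,0)$, and $(\lambda,0)$.

```python
# Illustrative only: the 2-torsion of a Legendre curve
# E_lam : y^2 = x(x - 1)(x - lam) over a prime field F_p consists of the
# point at infinity and the three points with y = 0.
p = 431                    # toy prime
lam = 7                    # toy Legendre parameter, lam != 0, 1 mod p

def on_curve(x, y):
    """Check that (x, y) satisfies y^2 = x(x - 1)(x - lam) over F_p."""
    return (y * y - x * (x - 1) * (x - lam)) % p == 0

two_torsion = [(x, 0) for x in range(p) if on_curve(x, 0)]
print(two_torsion)         # [(0, 0), (1, 0), (7, 0)]
```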