PhD Seminar • Artificial Intelligence • FedLog: Personalized Federated Classification with Less Communication and More Flexibility

Wednesday, October 22, 2025 2:30 pm - 3:30 pm EDT (GMT -04:00)

Please note: This PhD seminar will take place in DC 2584.

Haolin Yu, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Pascal Poupart

Federated representation learning (FRL) aims to learn personalized federated models with effective feature extraction from local data. FRL algorithms that share the majority of the model parameters incur significant communication overhead, which stems from the millions of neural network parameters and the slow aggregation progress of the averaging heuristic. To reduce this overhead, we propose FedLog, which shares sufficient data summaries instead of raw model parameters. The data summaries encode minimal sufficient statistics of an exponential family, and Bayesian inference is used for global aggregation. FedLog thereby reduces both message sizes and communication frequency. We prove that the shared message is minimal and theoretically analyze the convergence rate of FedLog.
To further provide formal privacy guarantees, we extend FedLog with the differential privacy framework. Empirical results demonstrate that our method achieves high learning accuracy with low communication overhead.
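To give a flavor of the core idea, the following is a minimal illustrative sketch, not the authors' implementation: each client sends the sufficient statistics of its local data under an assumed exponential-family model (here a toy Gaussian with known variance), and the server performs a conjugate Bayesian update instead of averaging model parameters. All function names and the choice of likelihood are illustrative assumptions.

```python
import numpy as np

def client_summary(x):
    # Sufficient statistics of local data for a Gaussian likelihood
    # with known variance: the sum of observations and the count.
    return x.sum(), len(x)

def server_aggregate(summaries, prior_mean=0.0, prior_var=1.0, noise_var=1.0):
    # Conjugate Bayesian update for the global mean parameter:
    # combine a Normal prior with the pooled sufficient statistics.
    total_sum = sum(s for s, _ in summaries)
    total_n = sum(n for _, n in summaries)
    post_var = 1.0 / (1.0 / prior_var + total_n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + total_sum / noise_var)
    return post_mean, post_var

# Five clients, each holding 50 local samples drawn around a true mean of 2.0.
rng = np.random.default_rng(0)
clients = [rng.normal(2.0, 1.0, size=50) for _ in range(5)]
mean, var = server_aggregate([client_summary(x) for x in clients])
```

Note that each client message is just two scalars regardless of local dataset size or model width, which illustrates why sharing sufficient statistics can be far cheaper than shipping millions of neural network parameters.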