Master’s Thesis Presentation • Artificial Intelligence — Improved Artificial Neural Network Models for Predicting Hourly Water Consumption
Steven Wang, Master’s candidate
David R. Cheriton School of Computer Science
Elaheh Jalalpour, Master’s candidate
David R. Cheriton School of Computer Science
Md Faizul Bari, PhD candidate
David R. Cheriton School of Computer Science
Ifaz Kabir, Master’s candidate
David R. Cheriton School of Computer Science
Ahmed Khan, Master’s candidate
David R. Cheriton School of Computer Science
Neurobiologically plausible learning algorithms for recurrent neural networks that can perform supervised learning are a neglected area of study. Equilibrium propagation is a recent synthesis of several ideas from biological and artificial neural network research: it uses a continuous-time, energy-based neural model with a local learning rule. However, despite operating on recurrent networks, equilibrium propagation has so far been applied only to discriminative categorization tasks.
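For reference, the core of equilibrium propagation can be stated in a few lines. The sketch below follows the standard Scellier–Bengio formulation; the symbols (energy E, cost C, clamping factor β, activation ρ, unit states u, weights W) are not given in the abstract and are assumed here.

% Sketch of the equilibrium propagation update, assuming the standard
% Hopfield-style energy of Scellier & Bengio (2017).
\[
  F_\beta(\theta, u) = E(\theta, u) + \beta\, C(u), \qquad
  u^0 \in \arg\min_u F_0(\theta, u), \qquad
  u^\beta \in \arg\min_u F_\beta(\theta, u)
\]
\[
  \Delta W_{ij} \;\propto\; \frac{1}{\beta}
  \left( \rho\!\left(u_i^\beta\right)\rho\!\left(u_j^\beta\right)
       - \rho\!\left(u_i^0\right)\rho\!\left(u_j^0\right) \right)
\]

The contrast between the weakly clamped equilibrium and the free equilibrium is purely local to each synapse, which is what makes the rule neurobiologically plausible while still approximating the gradient of the cost as β approaches 0.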
Omar Zia Khan, Senior Applied Scientist
Microsoft
Nicole McNabb, Master’s candidate
David R. Cheriton School of Computer Science
Hamid Tizhoosh, Systems Design Engineering
University of Waterloo
The history of artificial intelligence (AI) contains several ebbs and flows and is marked by many colorful personalities. We review major milestones in the development of machine learning, from principal component analysis to deep networks, and point to a multitude of pivotal developments that have strongly shaped the historical path of AI.
Daniel Recoskie, PhD candidate
David R. Cheriton School of Computer Science
Adam Molnar, Deakin University