Please note: This PhD seminar will be given online.
Andreas Stöckel, PhD candidate
David R. Cheriton School of Computer Science
Supervisor: Professor Chris Eliasmith
The “Legendre Memory Unit” (LMU) proposed by Voelker, Kajić, and Eliasmith (2019) is a recurrent neural network that substantially outperforms classic architectures such as reservoir computing and long short-term memory (LSTM) networks. At its core, the LMU is based on a linear time-invariant (LTI) system whose impulse response approximates the windowed Legendre polynomials. We discuss a novel and surprisingly simple derivation of this and similar LTI systems that generate windowed temporal bases. Using these methods, we construct LTI systems that offer slightly improved memory capacity over the LMU's LTI system. Building on these insights, we also construct feed-forward networks based on sliding-window spectra as an alternative to a feed-forward variant of the LMU, with different trade-offs in time and memory complexity.
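For readers unfamiliar with the LMU, the sketch below constructs the state-space matrices (A, B) of the LTI system at its core, as given in Voelker, Kajić, and Eliasmith (2019). This is background material only, not part of the seminar's new results; the function name and the simple Euler discretization are illustrative choices.

```python
import numpy as np

def lmu_lti(d, theta=1.0):
    """Construct the order-d LMU state-space system theta * dx/dt = A x + B u.

    A and B follow the closed form in Voelker, Kajic, and Eliasmith (2019):
        A[i, j] = (2i + 1) * (-1            if i < j
                              else (-1)^(i - j + 1))
        B[i]    = (2i + 1) * (-1)^i
    theta is the length of the sliding time window being approximated.
    """
    A = np.empty((d, d))
    for i in range(d):
        for j in range(d):
            A[i, j] = (2 * i + 1) * (-1.0 if i < j else (-1.0) ** (i - j + 1))
    q = np.arange(d)
    B = ((2 * q + 1) * (-1.0) ** q).reshape(-1, 1)
    return A / theta, B / theta

# Illustrative use: integrate the system with a forward-Euler step so that
# the state x holds Legendre coefficients of the recent input window.
A, B = lmu_lti(d=6, theta=1.0)
dt, x = 1e-3, np.zeros((6, 1))
for _ in range(1000):          # feed a constant input u = 1 for one window
    x = x + dt * (A @ x + B * 1.0)
```

After integrating a constant input for one full window length, the state approximately encodes that constant signal in the shifted Legendre basis.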
To join this PhD seminar on Zoom, please go to https://uwaterloo.zoom.us/j/95089656133?pwd=bjJTSytQU3plUDFWZmxaMXJMTXdFUT09.