Abdullah Rashwan, PhD candidate
David R. Cheriton School of Computer Science
Sum-product networks (SPNs) have recently emerged as an attractive representation due to their dual view: as a special type of deep neural network with clear semantics, and as a special type of probabilistic graphical model in which inference is always tractable. These properties follow from conditions (completeness and decomposability) that the structure of the network must satisfy.
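The two validity conditions can be stated concretely: every child of a sum node must have the same scope (completeness), and the children of a product node must have pairwise-disjoint scopes (decomposability). Below is a minimal sketch of these checks; the node classes are hypothetical illustrations, not part of any particular SPN library.

```python
# Hypothetical node classes for a sketch of SPN validity checking.
class Leaf:
    def __init__(self, var):
        self.scope = {var}          # the single variable this leaf models
        self.children = []

class Sum:
    def __init__(self, children):
        self.children = children
        self.scope = set().union(*(c.scope for c in children))

class Product:
    def __init__(self, children):
        self.children = children
        self.scope = set().union(*(c.scope for c in children))

def is_valid(node):
    """Check completeness (sum children share one scope) and
    decomposability (product children have disjoint scopes)."""
    if isinstance(node, Sum):
        # Completeness: every child of a sum node has the same scope.
        if any(c.scope != node.scope for c in node.children):
            return False
    if isinstance(node, Product):
        # Decomposability: product children cover disjoint variable sets.
        seen = set()
        for c in node.children:
            if seen & c.scope:
                return False
            seen |= c.scope
    return all(is_valid(c) for c in node.children)

# A valid SPN over variables X1, X2: a mixture of two factorizations.
spn = Sum([Product([Leaf("X1"), Leaf("X2")]),
           Product([Leaf("X1"), Leaf("X2")])])

# An invalid sum node mixing children with different scopes.
bad = Sum([Leaf("X1"), Leaf("X2")])
```

Here `is_valid(spn)` returns `True`, while `is_valid(bad)` returns `False` because the sum node is incomplete.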
As a result, it is not easy to specify a valid SPN by hand, so structure learning techniques are typically used in practice. In this work, we present a new online structure learning technique for feed-forward and recurrent SPNs. The algorithm is demonstrated on real-world datasets with continuous features, including sequence datasets of varying length, for which it is not clear what network architecture might be best.