Master’s Thesis Presentation • Artificial Intelligence — Disentangled Syntax and Semantics for Stylized Text Generation

Wednesday, September 9, 2020, 10:00 am EDT (GMT -04:00)

Please note: This master’s thesis presentation will be given online.

Yao Lu, Master’s candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Olga Vechtomova

Neural network based methods are widely used in text generation. End-to-end training of neural networks, which directly optimizes the full text generation pipeline, has proved powerful in various tasks, including machine translation and summarization. However, end-to-end training makes it difficult to control generation by partially changing text properties (semantics, writing style, length, etc.), which leaves text generation less flexible and controllable.

In this work, we study how to control the syntactic structure of generated text without changing its semantics. We propose a variational autoencoder based model with a disentangled latent space. Our framework introduces multitask learning and adversarial learning objectives as regularization applied to the syntax and content latent spaces separately.
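As a purely illustrative aid, the following is a minimal PyTorch-style sketch of how an encoder might split a sentence representation into separate syntax and content latent variables that are both fed to the decoder. All module names, dimensions, and the reparameterization details are assumptions made for illustration and are not taken from the thesis.

```python
import torch
import torch.nn as nn

class DisentangledVAE(nn.Module):
    """Sketch of a VAE whose encoder produces separate syntax and content latents (assumed design)."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512, z_syn=64, z_con=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
        # Separate heads produce the syntax and content posteriors.
        self.syn_mu, self.syn_logvar = nn.Linear(hid_dim, z_syn), nn.Linear(hid_dim, z_syn)
        self.con_mu, self.con_logvar = nn.Linear(hid_dim, z_con), nn.Linear(hid_dim, z_con)
        self.decoder = nn.LSTM(emb_dim + z_syn + z_con, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    @staticmethod
    def reparameterize(mu, logvar):
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, tokens):
        emb = self.embed(tokens)
        _, (h, _) = self.encoder(emb)
        h = h[-1]                      # final hidden state as the sentence summary
        z_syn = self.reparameterize(self.syn_mu(h), self.syn_logvar(h))
        z_con = self.reparameterize(self.con_mu(h), self.con_logvar(h))
        # The decoder sees both latents at every step (teacher forcing on the input tokens).
        z = torch.cat([z_syn, z_con], dim=-1).unsqueeze(1).expand(-1, emb.size(1), -1)
        dec_out, _ = self.decoder(torch.cat([emb, z], dim=-1))
        return self.out(dec_out), (z_syn, z_con)
```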

The syntax latent space is required to predict the sentence's constituency parse tree while being unable to predict its bag-of-words features. Conversely, the content latent space is required to predict the bag-of-words features while containing no information about the parse tree. Experimental results show that our model (TA-VAE) outperforms previous work. Quantitative and qualitative studies indicate that TA-VAE achieves high-quality disentanglement of the latent space for syntax-controlled text generation.
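The sketch below, again only illustrative, shows one common way such multitask and adversarial objectives can be combined: each latent has a predictor for its own target (the multitask term) and an adversary that tries to recover the other latent's target, whose success is penalized. The bag-of-words target, the simplified parse target (a single class label standing in for a constituency tree), the predictor heads, and all loss weights are assumptions, not details from the thesis; the adversary heads themselves would be trained in a separate step on detached latents.

```python
import torch
import torch.nn.functional as F

def bow_loss(logits, bow_target):
    """Multi-label bag-of-words prediction loss (assumed formulation)."""
    return F.binary_cross_entropy_with_logits(logits, bow_target)

def disentanglement_losses(z_syn, z_con, heads, bow_target, parse_target,
                           lam_mt=1.0, lam_adv=1.0):
    """heads: dict of small predictors, e.g. 'syn_parse', 'con_bow', 'syn_bow_adv', 'con_parse_adv'.

    parse_target is simplified here to a per-sentence class index; the thesis
    predicts constituency parse trees, which would need a structured decoder.
    """
    # Multitask terms: each latent predicts its own signal.
    mt = F.cross_entropy(heads['syn_parse'](z_syn), parse_target) \
         + bow_loss(heads['con_bow'](z_con), bow_target)
    # Adversarial terms: the encoder is penalized when an adversary can recover
    # the "wrong" signal from a latent, so this term enters with a flipped sign.
    adv = bow_loss(heads['syn_bow_adv'](z_syn), bow_target) \
          + F.cross_entropy(heads['con_parse_adv'](z_con), parse_target)
    # The encoder minimizes this combined regularizer; the adversary heads are
    # updated separately to maximize their own prediction accuracy.
    return lam_mt * mt - lam_adv * adv
```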


To join this master’s thesis presentation on Zoom, please go to https://us02web.zoom.us/j/87094354491?pwd=dWl6Z1E3VVlRQVFhOUZrd05QVDFPZz09.