Please note: This PhD seminar will take place online.
Zhiying Jiang, PhD candidate
David R. Cheriton School of Computer Science
Supervisor: Professor Jimmy Lin
Deep neural networks (DNNs) are often used for text classification due to their high accuracy. However, DNNs can be computationally intensive, requiring millions of parameters and large amounts of labeled data, which makes them expensive to use, to optimize, and to transfer to out-of-distribution (OOD) cases in practice.
In this paper, we propose a non-parametric alternative to DNNs that is easy, lightweight, and universal in text classification: a combination of a simple compressor like gzip with a k-nearest-neighbor classifier. Without any training parameters, our method achieves results that are competitive with non-pretrained deep learning methods on six in-distribution datasets. It even outperforms BERT on all five OOD datasets, including four low-resource languages. Our method also excels in few-shot settings, where labeled data are too scarce for DNNs to achieve satisfactory accuracy.
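The flavor of a compressor-plus-kNN classifier can be conveyed with a minimal sketch. The snippet below assumes a normalized compression distance between documents (computed from gzip-compressed lengths) and a simple majority vote over the k nearest training examples; it is an illustration of the general technique, not the authors' exact implementation, and the toy data are made up.

```python
import gzip

def ncd(x: str, y: str) -> float:
    """Normalized compression distance between two strings, using gzip-compressed lengths."""
    cx = len(gzip.compress(x.encode()))
    cy = len(gzip.compress(y.encode()))
    cxy = len(gzip.compress((x + " " + y).encode()))
    return (cxy - min(cx, cy)) / max(cx, cy)

def knn_classify(test_text, train_texts, train_labels, k=2):
    """Label a test document by majority vote among its k nearest training documents."""
    distances = [ncd(test_text, t) for t in train_texts]
    top_k = sorted(range(len(distances)), key=lambda i: distances[i])[:k]
    votes = [train_labels[i] for i in top_k]
    return max(set(votes), key=votes.count)

# Toy usage: classify a new sentence against a tiny labeled set.
train_texts = [
    "the match ended in a draw",
    "stocks fell sharply today",
    "the striker scored twice in the second half",
    "the central bank raised interest rates",
]
train_labels = ["sports", "finance", "sports", "finance"]
print(knn_classify("the goalkeeper saved a late penalty", train_texts, train_labels, k=2))
```

The appeal of this style of method is that there is nothing to train: all of the "modeling" is delegated to an off-the-shelf compressor, so the same procedure applies unchanged to new domains and languages.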