Master’s Thesis Presentation • Machine Learning • Simple Yet Effective Pseudo Relevance Feedback with Rocchio’s Technique and Text Classification

Friday, August 12, 2022 — 10:00 AM to 11:00 AM EDT

Please note: This master’s thesis presentation will take place online.

Yuqi Liu, Master’s candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Jimmy Lin

With the continuous growth of the Internet and the availability of large-scale collections, helping users locate the information they need has become a necessity. Generally, an information retrieval system processes an input query and returns a ranked list of results. However, this process can be challenging due to the “vocabulary mismatch” between input queries and passages. A well-known technique to address this issue is “query expansion”, which reformulates the given query by selecting and adding more relevant terms. Relevance feedback, a form of query expansion, collects users’ judgments on candidate passages and draws expansion terms from the ones judged relevant. Pseudo relevance feedback instead assumes that the top documents from the initial retrieval are relevant and rebuilds the query without any user interaction.
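
To make this concrete, the sketch below shows one simple realization of the idea: run an initial retrieval, treat the top-ranked documents as relevant, and append their most frequent terms to the query before searching again. The scoring scheme, the number of feedback documents, and the number of expansion terms here are illustrative assumptions rather than the configuration studied in the thesis.

```python
from collections import Counter

def expand_query(query_terms, ranked_docs, num_fb_docs=10, num_fb_terms=10):
    """Naive pseudo relevance feedback by term selection.

    query_terms: list of terms in the original query
    ranked_docs: documents from the initial retrieval, best first,
                 each represented as a list of terms
    """
    # Assume the top-ranked documents are relevant (the "pseudo" part).
    feedback_docs = ranked_docs[:num_fb_docs]

    # Count how often each term appears in the pseudo-relevant documents.
    term_counts = Counter()
    for doc in feedback_docs:
        term_counts.update(doc)

    # Add the most frequent new terms to the original query.
    expansion = [term for term, _ in term_counts.most_common()
                 if term not in query_terms][:num_fb_terms]

    # The expanded query is then used for a second retrieval pass.
    return query_terms + expansion
```

In practice the expansion terms are also weighted, for example by RM3 or by Rocchio’s Technique, but the overall loop (retrieve, assume the top results are relevant, reformulate, retrieve again) stays the same.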

In this thesis, we will discuss two implementations of pseudo relevance feedback: the decades-old Rocchio’s Technique and the more recent use of text classification. As the reader might notice, neither technique is “novel” anymore; Rocchio’s Technique, for example, dates back to the 1960s. Both were proposed and studied before the neural age, when texts were still mostly stored as bag-of-words representations. Today, transformers have been shown to advance information retrieval, and searching with transformer-based dense representations outperforms traditional bag-of-words searching on many challenging and complex ranking tasks. This motivates us to ask the following three research questions:

  • RQ1: Given strong baselines, large labelled datasets, and the emergence of transformers today, does pseudo relevance feedback with Rocchio’s Technique still perform effectively with both sparse and dense representations?
  • RQ2: Given strong baselines, large labelled datasets, and the emergence of transformers today, does pseudo relevance feedback via text classification still perform effectively with both sparse and dense representations?
  • RQ3: Does applying pseudo relevance feedback with text classification on top of Rocchio’s Technique result in further improvements?

To answer RQ1, we have implemented Rocchio’s Technique with sparse representations based on the Anserini and Pyserini toolkits. Building on a previous implementation of Rocchio’s Technique with dense representations in the Pyserini toolkit, we can easily evaluate and compare its impact on retrieval effectiveness with both sparse and dense representations. By applying Rocchio’s Technique to the MS MARCO Passage and Document TREC Deep Learning topics, we achieve an increase of about 0.03-0.04 in average precision. It is no surprise that Rocchio’s Technique outperforms the BM25 baseline, but it is notable that it is competitive with or even superior to RM3, a more common strong baseline, under most circumstances. Hence, we propose switching to Rocchio’s Technique as a more robust and general baseline in future studies. To our knowledge, pseudo relevance feedback via text classification using both positive and negative labels had not been well studied before our work.
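
For reference, the following is a minimal sketch of the classic Rocchio update; it works the same way whether the vectors are sparse bag-of-words weights or dense transformer embeddings, which is what makes the comparison across representations possible. The default weights below are conventional textbook values, not the tuned parameters reported in the thesis.

```python
import numpy as np

def rocchio_update(query_vec, relevant_vecs, nonrelevant_vecs,
                   alpha=1.0, beta=0.75, gamma=0.15):
    """Classic Rocchio query reformulation.

    query_vec:        (d,) vector of the original query
    relevant_vecs:    (k, d) vectors of the (pseudo-)relevant documents
    nonrelevant_vecs: (m, d) vectors of the non-relevant documents
    """
    # Move the query toward the centroid of the relevant documents
    # and away from the centroid of the non-relevant ones.
    positive = relevant_vecs.mean(axis=0)
    negative = nonrelevant_vecs.mean(axis=0)
    return alpha * query_vec + beta * positive - gamma * negative
```

In the pseudo relevance feedback setting, the relevant vectors are simply the top-ranked documents from the initial retrieval; the non-relevant side can be taken from lower-ranked candidates or dropped entirely by setting gamma to zero.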

To answer RQ2, we have verified the effectiveness of pseudo relevance feedback via text classification with both sparse and dense representations. Three classifiers are trained, namely logistic regression (LR), support vector machines (SVM), and k-nearest neighbours (KNN), and all of them enhance effectiveness. We also observe that pseudo relevance feedback via text classification yields greater improvements with dense representations than with sparse ones. However, when we compare text classification to Rocchio’s Technique, we find that Rocchio’s Technique is superior to pseudo relevance feedback via text classification under all circumstances.
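
A minimal sketch of the classification-based variant is shown below, assuming scikit-learn’s logistic regression and a simple top-k/bottom-k split for pseudo labels; the actual classifiers, feature representations, and label assignments used in the thesis differ in their details.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def rescore_with_classifier(doc_vecs, initial_scores,
                            k_pos=10, k_neg=100, interpolation=0.5):
    """Pseudo relevance feedback via text classification.

    doc_vecs:       (N, d) NumPy array of candidate document vectors
                    (bag-of-words weights or dense embeddings)
    initial_scores: (N,) scores from the first-stage retrieval
    """
    order = np.argsort(-initial_scores)

    # Pseudo labels: top-ranked candidates are positives,
    # bottom-ranked candidates are negatives.
    pos, neg = order[:k_pos], order[-k_neg:]
    X = np.vstack([doc_vecs[pos], doc_vecs[neg]])
    y = np.array([1] * len(pos) + [0] * len(neg))

    clf = LogisticRegression(max_iter=1000).fit(X, y)

    # Normalize the first-stage scores so the two signals are comparable,
    # then interpolate with the classifier's relevance probabilities.
    spread = initial_scores.max() - initial_scores.min()
    norm_scores = (initial_scores - initial_scores.min()) / (spread + 1e-9)
    clf_scores = clf.predict_proba(doc_vecs)[:, 1]
    final = interpolation * clf_scores + (1 - interpolation) * norm_scores

    # Return the candidates re-ranked by the combined score.
    return np.argsort(-final)
```

Because this rescoring only needs a ranked candidate list, the same idea can be applied on top of any first stage, which is how text classification is stacked on BM25 + RM3 or on Rocchio’s Technique in RQ3.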

In RQ3, the success of pseudo relevance feedback via text classification on top of BM25 + RM3 across four newswire collections in our previous paper motivates us to study its impact on top of another query expansion method, Rocchio’s Technique. However, unlike with RM3, we observe little difference in the two evaluation metrics after applying pseudo relevance feedback via text classification on top of Rocchio’s Technique. This work aims to explore simple yet effective techniques that might be overlooked in the era of deep learning and transformers. Instead of pursuing “more”, we aim to show what can be achieved with “less”. We demonstrate the robustness and effectiveness of these “out-of-date” methods in the age of neural networks.


To join this master’s thesis presentation on Zoom, please go to https://zoom.us/j/91022612792.

Location
Online master’s thesis presentation
200 University Avenue West
Waterloo, ON N2L 3G1
Canada