PhD Seminar • Data Systems • Efficient Dense Representation Learning for Information Retrieval

Wednesday, January 24, 2024 — 12:30 PM to 1:30 PM EST

Please note: This PhD seminar will take place in DC 1304.

Sheng-Chieh (Jack) Lin, PhD candidate
David R. Cheriton School of Computer Science

Supervisor: Professor Jimmy Lin

Contrastive learning is a commonly used technique for training an effective neural retrieval model; however, it requires substantial computational resources (e.g., multiple GPUs or TPUs).

In this talk, I will present two of our previous works on efficiently training neural models for dense retrieval for web search: (1) in-batch negatives for knowledge distillation with tightly-coupled teachers for dense retrieval (TCT-ColBERT); and (2) contextualized query embeddings for conversational search (CQE). First, I present how we use ColBERT as a teacher to efficiently train a single-vector dense retrieval model that reaches competitive effectiveness with limited training resources. Second, I discuss how to quickly adapt the fine-tuned dense retriever to conversational search without using human relevance labels. Both works advanced state-of-the-art retrieval effectiveness upon publication and were trained in free Colab with a single TPUv2.
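To illustrate the first technique, below is a minimal NumPy sketch of in-batch knowledge distillation in the spirit of TCT-ColBERT: every query is scored against all passages in the batch (so each positive is contrasted with the other in-batch passages as negatives), and the student's score distribution is pulled toward the teacher's soft scores via KL divergence. The function name, temperature value, and use of raw dot-product scores are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def inbatch_distillation_loss(q_emb, p_emb, teacher_scores, temperature=0.25):
    """Sketch of in-batch distillation for a single-vector dense retriever.

    q_emb          : (B, d) student query embeddings
    p_emb          : (B, d) student passage embeddings (row i is query i's positive)
    teacher_scores : (B, B) teacher (e.g., ColBERT) scores for every in-batch
                     query-passage pair, used as soft labels
    """
    # Student scores every query against every in-batch passage, so each
    # positive competes with B-1 in-batch negatives.
    student_scores = q_emb @ p_emb.T  # (B, B)

    # Soften both score distributions and compute KL(teacher || student),
    # averaged over the queries in the batch.
    t = softmax(teacher_scores / temperature)
    s = softmax(student_scores / temperature)
    eps = 1e-12  # avoid log(0)
    return float(np.mean(np.sum(t * (np.log(t + eps) - np.log(s + eps)), axis=-1)))

# Tiny usage example with random embeddings and teacher scores.
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
p = rng.standard_normal((4, 8))
ts = rng.standard_normal((4, 4))
loss = inbatch_distillation_loss(q, p, ts)
```

Because the objective only needs the B×B score matrix of a single batch, the in-batch negatives come "for free" from the batch itself, which is part of what keeps the training footprint small enough for a single accelerator.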

Location 
DC - William G. Davis Computer Research Centre
DC 1304
200 University Avenue West

Waterloo, ON N2L 3G1
Canada
