Meet Wenhu Chen, a professor who studies natural language processing, deep learning, knowledge representation and reasoning

Monday, October 24, 2022

Wenhu Chen joined the Cheriton School of Computer Science in fall 2022 as an Assistant Professor. Before joining Waterloo as faculty, he was a Research Scientist at Google Research.

His research spans natural language processing, deep learning, and knowledge representation and reasoning. Specifically, he aims to develop models that can ground and reason over external world knowledge to understand human language and to communicate with humans. He is also interested in multi-modal problems like visual question answering and captioning.

The following is a lightly edited transcript of a Q&A interview.

Professor Wenhu Chen in the Davis Centre. Photo credit: Xinyi Wang

Tell us a bit about yourself.

I grew up in the southeast part of China and completed my undergrad degree in central China at Huazhong University of Science and Technology. After I completed my undergrad degree, I moved to Aachen, Germany, to pursue my master’s degree, then to the United States in 2017 to do a PhD at the University of California, Santa Barbara, under the guidance of William Wang, my main advisor. After my PhD, I worked at Google Research and then joined the Cheriton School of Computer Science as faculty.

When did you become interested in natural language processing and deep learning?

When I was pursuing my master’s degree, majoring in electrical engineering, I found the electrical engineering courses less interesting than the computer science ones I was taking, especially the courses on machine translation, computer vision and speech recognition. The courses related to natural language processing excited me far more than the electrical engineering courses on topics such as Fourier transforms of electromagnetic waves or how to deploy antennas for radio communications.

In contrast, with deep learning I could write some code that could recognize the human voice and transcribe it into text. Deep learning and machine learning were new to me, and I was fascinated by how powerful and practical they are and what they can accomplish when addressing real-world issues.

What attracted you to the Cheriton School of Computer Science?

I have known about the University of Waterloo since I was an undergrad. One of my classmates, a top student who had the highest GPA in my class, went to Waterloo to pursue a master’s degree in math after finishing her undergrad degree. I had a lot of conversations with her and kept in touch over the years. She told me lots of good things about Waterloo, the strength of the students here, and how large and strong the School of Computer Science is.

When I was looking for employment, I found out about opportunities at Waterloo through some tweets that Gautam Kamath, a faculty member here, had sent about the School of Computer Science. I saw this as a great opportunity, especially because during the early days of the pandemic many other universities had frozen hiring. In contrast, the School of Computer Science had several openings for faculty. I thought this was a great opportunity that I needed to grasp.

I also had a great impression of the School and its faculty during my interview. The School of Computer Science has many faculty who conduct research in similar and complementary areas. I also knew of some of the people here such as Jimmy Lin through his work on information retrieval, NLP and question answering systems. I also knew about several faculty who work in artificial intelligence — Pascal Poupart, Yaoliang Yu, Gautam Kamath, Kimon Fountoulakis and Shai Ben-David. We have a lot of research interests in common, especially in fundamental areas of NLP research. I thought, if I were to join the School, I’d have many opportunities to collaborate with an excellent team of computer science researchers.

I knew Waterloo was my top choice, and after my interview I chatted with my advisors at the University of California, Santa Barbara, both of whom encouraged me to consider Waterloo very seriously. When I got the offer, I accepted almost immediately.

Tell us a bit about your research.

My research is mostly in natural language processing, deep learning, and areas related to knowledge such as knowledge representation and reasoning.

I design AI models that ground their predictions in real-world knowledge. More specifically, I build models with an explicit component that captures knowledge from the web or other data sources and reasons over this knowledge before making predictions. And when the model makes a prediction, it can provide the rationale or the evidence the prediction is based on, instead of just producing an answer without any verifiable evidence. That’s the end goal. Having a rationale increases the trustworthiness and the reliability of the models.

One basic example is open-domain question answering. In a Google search you can type in a question, for example, what’s the exchange rate of US dollars to Canadian dollars, or when was Barack Obama born. Instead of just returning relevant pages that contain information answering the question, we want a system that can summarize that information into a phrase or a couple of sentences.

At this point, my research is focused on two things: how to retrieve such information from a sea of webpages, and how to adapt to changes over time. For example, you might ask a time-sensitive question, such as who is the President of the United States or which team is the NBA champion. The answer this year may be different from next year’s.

This information is not static, so the model needs to adapt to changes over time instead of memorizing things that become stale or outdated. Those are the two things I’m studying now.
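To make this concrete, here is a minimal, self-contained Python sketch of retrieval-augmented question answering in the spirit of what Professor Chen describes: retrieve candidate evidence for a question, prefer fresher evidence for time-sensitive questions, and return the answer together with the source it is grounded in. The toy corpus, the lexical-overlap scoring and the freshness heuristic are illustrative assumptions, not his actual systems.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Passage:
    text: str
    source: str          # where the evidence came from (the "rationale")
    last_updated: date   # used to prefer fresher evidence for time-sensitive questions

# Toy corpus standing in for a sea of webpages.
CORPUS = [
    Passage("Joe Biden is the President of the United States.",
            "example.org/presidents", date(2021, 1, 20)),
    Passage("Donald Trump is the President of the United States.",
            "example.org/presidents-archive", date(2017, 1, 20)),
    Passage("The Golden State Warriors won the 2022 NBA championship.",
            "example.org/nba", date(2022, 6, 17)),
]

def score(question: str, passage: Passage) -> float:
    """Crude lexical-overlap relevance score, a stand-in for a learned retriever."""
    q_tokens = set(question.lower().split())
    p_tokens = set(passage.text.lower().split())
    return len(q_tokens & p_tokens) / max(len(q_tokens), 1)

def answer(question: str, top_k: int = 2) -> dict:
    """Retrieve the top-k passages, prefer the most recently updated one
    (a simple way to adapt to knowledge that changes over time), and return
    the answer together with the evidence it is based on."""
    ranked = sorted(CORPUS, key=lambda p: score(question, p), reverse=True)[:top_k]
    best = max(ranked, key=lambda p: p.last_updated)
    return {"answer": best.text, "evidence": best.source,
            "as_of": best.last_updated.isoformat()}

print(answer("Who is the President of the United States?"))
```

A production system would replace the overlap score with a learned retriever and the final lookup with a generative reader, but the shape of the pipeline, retrieve, reason, then answer with evidence, stays the same.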

Do you see opportunities for collaborative research at the School of Computer Science?

Yes, definitely. In fact, I already have a PhD student I am co-advising with Jimmy Lin. Junior faculty can’t be the sole advisor of a doctoral student at first, so we co-advise with a more senior faculty member. In this case, I recruited the student, who is co-advised by Jimmy Lin and me. We’re working on how to retrieve images from the web, basically multimodal retrieval.

I’m also applying for a grant with Pascal Poupart in partnership with a Canadian insurance company. We’re proposing to build models that use claims data to predict how much coverage to provide based on the customer’s history. And with Gautam Kamath, we’re thinking about building privacy-preserving NLP models, specifically, how we can train NLP models without leaking any private data.

What do you see as your most significant contribution or work to date?

This would be work I did during my PhD. At the time, when designing models that reason over knowledge, people were considering only two knowledge forms: textual knowledge, that is, passages of text or documents, and knowledge graphs, a knowledge base where one object has a relation to another object.

However, a ubiquitous knowledge form that people did not consider was tables. When you browse the web, you see lots of tables that summarize information as values, phrases or sentences. I found a gap that I could explore and work on. I’ve been devoted to designing different table reasoning NLP models, for example, models for question answering and fact verification over tables. I collected several datasets manually, using crowdsourcing workers to annotate things like question-and-answer pairs, and have published a sequence of papers on this. Collectively, they have been cited more than 500 times and have been adopted widely by the community.
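As a rough illustration of the idea, and not the models or datasets from his papers, one common way to let a text-based model reason over a table is to linearize the rows into sentences and then answer questions or verify claims against that text. The tiny table and the naive word-overlap check below are assumptions made up for this sketch.

```python
# Toy table, loosely in the style of tables found on the web.
TABLE = {
    "header": ["Team", "Year", "Championship"],
    "rows": [
        ["Toronto Raptors", "2019", "NBA"],
        ["Bayern Munich", "2020", "Champions League"],
    ],
}

def linearize(table: dict) -> str:
    """Flatten each row into a 'column is value' sentence so a text-based
    model can consume the table as ordinary language."""
    sentences = []
    for row in table["rows"]:
        pairs = [f"{col} is {val}" for col, val in zip(table["header"], row)]
        sentences.append("; ".join(pairs) + ".")
    return " ".join(sentences)

def verify(claim: str, table: dict) -> bool:
    """Naive fact verification: a claim counts as supported if a single row
    contains every content word of the claim. A real model would reason over
    the linearized table instead of matching words."""
    stopwords = {"won", "the", "in", "a", "an", "of"}
    claim_words = {w.strip(".,").lower() for w in claim.split()} - stopwords
    for row in table["rows"]:
        row_words = {w.lower() for cell in row for w in cell.split()}
        if claim_words <= row_words:
            return True
    return False

print(linearize(TABLE))
print(verify("Toronto Raptors won the NBA in 2019", TABLE))   # True
print(verify("Toronto Raptors won the NBA in 2020", TABLE))   # False
```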

Who has inspired you most in your academic life?

The first person who comes to my mind is my main PhD advisor. I am very thankful for William Wang’s guidance and support. He identified me as a potential PhD student even though when I was applying to PhD programs, I didn’t have much research experience and my GPA from my undergrad days wasn’t the highest.

I wrote a letter to William, telling him how willing I was to devote myself to learning about machine learning and natural language processing. Even though I wasn’t a top-tier student, he was moved by my letter. He spent a lot of time advising me during my PhD and helped me find financial support. He also gave me a lot of great advice about career choices, whether I should pursue a career in industry or academia. He was also the one who urged me to join the University of Waterloo.

What do you do in your spare time?

I used to play basketball daily. I played in some local games but not at a competitive level. I also enjoy European soccer and watch a lot of games, the Champions League and the World Cup. I’m also a big fan of Bayern Munich, a German football club.
