
PhD candidate Anastasia Kuzminykh looks into why it’s so hard to pay attention during video conferences

Wednesday, April 22, 2020

As global COVID-19 lockdowns have many of us sitting through days of video conferences, it has become clear that paying attention online is hard work.

In two new papers, researchers from the Cheriton School of Computer Science and Microsoft Research explore people's attentiveness in video conferences to understand how to make online meetings more comfortable and effective.

Key features suggested to improve people's attention in online meetings include the ability to zoom in, notifications of actions that have occurred between meeting participants, the use of split system views and the ability to track people's gazes.

“What we informally call ‘paying attention’ in video conferences is really quite complex,” said Anastasia Kuzminykh, a PhD candidate in Waterloo’s Cheriton School of Computer Science. “Our studies show that we need to pay more attention to attention itself.”

The researchers conducted interviews with experienced meeting participants to understand how visual attention in meetings was related to levels of engagement. They then carried out an experimental study in which participants watched muted videos of real work meetings and tried to narrate the attention processes they observed. Their goal was to develop a model of how we use visual attention in meetings.

Their model divides attention into three categories:

  • Attention as direction — where people look
  • Attention as action — looking with intent
  • Attention as state — the sense of engagement with the meeting’s purpose

“We identified attention as action as the category that might be best suited to developing new features because AI could augment human vision in meetings — helping us gather, signal and follow attention,” said Anastasia Kuzminykh. “This is especially important when you’re the one remote person talking to a room full of people at the other end. We need help seeing into that room.

“Key features might include zooming where everyone is looking, notifications of attention between participants, and dynamic views rather than static ‘Brady Bunch’ style walls of video.”

The second paper explores low engagement in meetings as a deliberate practice: participants use technology features, such as turning video on or off, to signal attention expectations to others. The researchers argue that video conferencing should support a range of attention levels rather than assume that everyone should pay full attention at all times.

The two papers — Classification of Functional Attention in Video Meetings and Low Engagement As a Deliberate Practice of Remote Participants in Video Meetings — were authored by Anastasia Kuzminykh and her colleague Sean Rintel, a Senior Researcher at Microsoft Research Cambridge, and accepted for presentation at the CHI 2020 conference.
