Cheriton School of Computer Science researchers find that people are too trusting of virtual assistants like Alexa and Siri

Thursday, January 2, 2020

Attributing lifelike qualities to conversational assistants could cause people to reveal more personal information to the companies that develop the assistants than they otherwise would. 

Researchers at Waterloo’s Cheriton School of Computer Science found that people tend to increase their sharing with online conversational agents, such as Amazon’s Alexa, Google Assistant and Apple’s Siri, because of a tendency to anthropomorphize — to assign personalities and physical features such as age, facial expressions and hairstyles to the virtual assistants.

“People are anthropomorphizing these conversation agents, which could result in them revealing information to the companies behind these agents that they otherwise wouldn’t,” said Edward Lank, a professor in Waterloo’s David R. Cheriton School of Computer Science. “These agents are data-gathering tools that companies are using to sell us stuff. 

“People need to reflect a little to see if they are formulating impressions of these agents rather than seeing them as just a piece of technology and trusting them in ways based on these impressions.” 

Study researchers (from top left): PhD candidate Anastasia Kuzminykh, Jenny Sun (now a Software Engineer at Instagram), Niv Govindaraju (now an Associate Product Manager at Google), Computer Science Lecturer Jeff Avery, and Professor Edward Lank.

In undertaking the study, the researchers had 10 men and 10 women interact with three conversational agents — Alexa, Google Assistant and Siri. The researchers then interviewed the 20 participants to determine their perception of the agents’ personalities and what they would look like before finally asking each participant to create an avatar for each agent. 

The study’s combined results revealed that participants predominantly described Siri’s sentiment as disingenuous and cunning, while they saw Alexa as genuine and caring. Participants commonly described Alexa’s individuality as neutral and ordinary, while they considered the individuality of Google, and especially Siri, to be more defined and pronounced.

In describing the agents visually, the participants perceived Alexa to be of average height or slightly shorter, older than the other agents, and wearing casual or business-casual clothes of dark or neutral colours. Her hair tended to be seen as darker, wavy, and worn down.

How the participants generally visualized the three conversational assistants: Amazon’s Alexa (left), Google Assistant (centre) and Apple’s Siri (right)

The volunteers tended to perceive Google as being average height or taller, wearing either casual clothes with a focus on tech culture (e.g., hoodies), or business-formal clothes, both of dark or neutral colours. They tended to perceive Google’s hair as lighter in colour (blond, brunette) and as either long and straight, worn down or worn up (bun, ponytail), and they specifically associated Google with higher professionalism. 

Siri was commonly described as being of average height, younger than the other agents, and rarely wearing glasses, wearing either casual but fashionable clothes (V-necks, tank tops, heels) or strictly business-formal style, of either dark or particularly bright colours, especially red. The participants described Siri’s hair as short or as long straight hair worn down, either blond or black.

“This is a window into the way of thinking and, unfortunately, there are a lot of biases,” said Anastasia Kuzminykh, a PhD candidate at the Cheriton School of Computer Science. “How an agent is perceived impacts how it’s accepted and how people interact with it, how much people trust it, how much people talk to it, and the way people talk to it.”

This research in the news

The more personable a virtual assistant seems, the more data you'll share: study, by Nicole Bogart, Friday, January 3, 2020

People are trusting virtual assistants way too much, Technology.org, Friday, January 3, 2020

Trusting virtual assistants could make us more revealing, UW study suggests, by Christine Clark, Friday, January 3, 2020

The study, Genie in the Bottle: Anthropomorphized Perceptions of Conversational Agents, will be presented at the ACM CHI Conference on Human Factors in Computing Systems, to be held in Honolulu, USA from April 25 to 30, 2020. The article will appear in the Proceedings of the CHI Conference on Human Factors in Computing Systems.
