Cheriton School of Computer Science researchers find that people are too trusting of virtual assistants like Alexa and Siri

Thursday, January 2, 2020

Attributing lifelike qualities to conversational assistants could cause people to reveal more personal information to the companies that develop the assistants than they otherwise would. 

Researchers at Waterloo’s Cheriton School of Computer Science found that people tend to increase their sharing with online conversational agents, such as Amazon’s Alexa, Google Assistant and Apple’s Siri, because of a tendency to anthropomorphize — to assign personalities and physical features such as age, facial expressions and hairstyles to the virtual assistants.

“People are anthropomorphizing these conversational agents, which could result in them revealing information to the companies behind these agents that they otherwise wouldn’t,” said Edward Lank, a professor in Waterloo’s David R. Cheriton School of Computer Science. “These agents are data-gathering tools that companies are using to sell us stuff.

“People need to reflect a little to see if they are formulating impressions of these agents rather than seeing them as just a piece of technology and trusting them in ways based on these impressions.” 

Study researchers (from top left): PhD candidate Anastasia Kuzminykh, Jenny Sun (now a Software Engineer at Instagram), Niv Govindaraju (now an Associate Product Manager at Google), Computer Science Lecturer Jeff Avery, and Professor Edward Lank.

In undertaking the study, the researchers had 10 men and 10 women interact with three conversational agents — Alexa, Google Assistant and Siri. The researchers then interviewed the 20 participants to determine their perceptions of the agents’ personalities and appearance, before finally asking each participant to create an avatar for each agent.

Across the study’s combined results, participants predominantly described Siri’s sentiment as disingenuous and cunning, and Alexa’s as genuine and caring. They commonly described Alexa’s individuality as neutral and ordinary, while they considered the individuality of Google Assistant, and especially Siri, to be more defined and pronounced.

In describing the agents visually, the participants perceived Alexa to be of average height or slightly shorter, older than the other agents, and wearing casual or business-casual clothes of dark or neutral colours. Her hair tended to be seen as darker, wavy, and worn down.

How the participants generally visualized the three conversational assistants: Amazon’s Alexa (left), Google Assistant (centre) and Apple’s Siri (right)

The volunteers tended to perceive Google as being of average height or taller, wearing either casual clothes with a focus on tech culture (e.g., hoodies) or business-formal clothes, both in dark or neutral colours. They tended to perceive Google’s hair as lighter in colour (blond or brunette) and as either long and straight and worn down, or worn up (bun, ponytail), and they specifically associated Google with higher professionalism.

Siri was commonly described as being of average height, younger than the other agents, and rarely wearing glasses. Participants dressed her either in casual but fashionable clothes (V-necks, tank tops, heels) or in a strictly business-formal style, in colours that were either dark or particularly bright, especially red. They described Siri’s hair as short, or as long, straight and worn down, and as either blond or black.

“This is a window into the way of thinking and, unfortunately, there are a lot of biases,” said Anastasia Kuzminykh, a PhD candidate at the Cheriton School of Computer Science. “How an agent is perceived impacts how it’s accepted and how people interact with it, how much people trust it, how much people talk to it, and the way people talk to it.”

This research in the news

The more personable a virtual assistant seems, the more data you'll share: study, by Nicole Bogart, CTVNews.ca, Friday, January 3, 2020

People are trusting virtual assistants way too much, Technology.org, Friday, January 3, 2020

Trusting virtual assistants could make us more revealing, UW study suggests, by Christine Clark, KitchenerToday.com, Friday, January 3, 2020


The study, Genie in the Bottle: Anthropomorphized Perceptions of Conversational Agents, will be presented at the ACM CHI Conference on Human Factors in Computing Systems, to be held in Honolulu, USA, from April 25 to 30, 2020. The article will appear in the Proceedings of the CHI Conference on Human Factors in Computing Systems.
