Giving a human touch to Alexa or Siri can backfire

New York, April 21: An Indian American researcher-led team has found that giving a human touch to chat bots like Apple's Siri or Amazon's Alexa may actually disappoint users.

Just giving a chat bot a human name or adding human-like features to its avatar might not be enough to win over a user if the device fails to maintain a conversational back-and-forth with that person, according to S. Shyam Sundar, co-director of the Media Effects Research Laboratory at Pennsylvania State University.

"People are pleasantly surprised when a chat bot with fewer human cues has higher interactivity," said Sundar.

"But when there are high human cues, it may set up your expectations for high interactivity - and when the chat bot doesn't deliver that - it may leave you disappointed," he added.

In fact, human-like features might create a backlash against less responsive human-like chat bots.

During the study, Sundar found that chat bots that had human features -- such as a human avatar -- but lacked interactivity disappointed the people who used them.

However, people responded better to a less-interactive chat bot that did not have human-like cues.

High interactivity is marked by swift responses that match a user's queries and feature a threaded exchange that can be followed easily.

According to Sundar, even small changes in the dialogue, like acknowledging what the user said before providing a response, can make the chat bot seem more interactive.
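For illustration only, here is a minimal Python sketch of the kind of small dialogue change Sundar describes; the function names and messages are hypothetical and not taken from the study. A reply that first acknowledges what the user said reads as more of a threaded back-and-forth than one that simply returns an answer.

```python
def plain_reply(answer: str) -> str:
    # Low-interactivity style: just return the answer.
    return answer


def acknowledging_reply(user_message: str, answer: str) -> str:
    # Higher-interactivity style: acknowledge the user's message first,
    # so the exchange reads as a threaded back-and-forth.
    return f'You asked about "{user_message}". {answer}'


if __name__ == "__main__":
    question = "store opening hours"
    answer = "We are open 9am to 6pm, Monday through Saturday."
    print(plain_reply(answer))
    print(acknowledging_reply(question, answer))
```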

Because there is an expectation that people may be leery of interacting with a machine, developers typically add human names to their chat bots -- for example, Apple's Siri -- or programme a human-like avatar to appear when the chat bot responds to a user.

The researchers, who published their findings in the journal Computers in Human Behavior, also found that simply mentioning whether a human or a machine is involved -- that is, providing an identity cue -- guides how people perceive the interaction.

For the study, the researchers recruited 141 participants through Amazon Mechanical Turk, a crowdsourcing site where people are paid to participate in studies.

Sundar said the findings could help developers improve acceptance of chat technology among users.

"There's a big push in the industry for chat bots," said Sundar.

"They're low-cost and easy-to-use, which makes the technology attractive to companies for use in customer service, online tutoring and even cognitive therapy -- but we also know that chat bots have limitations," he added.
