Should We Mind Our Manners Around Our AI Assistants? - Natterhub
Digital assistants are everywhere. One in five UK households has a smart speaker, and 70% of people with a voice-assisted device say they use it every single day. By 2021, digital assistants are predicted to outnumber human beings. And for as long as they’ve been a part of our lives, we’ve made them say and do silly things.

Usually it’s harmless - asking Alexa to tell us a bad joke, or telling Siri to make a fart noise - but this attitude of treating voice assistants like tools and toys may also have a darker side. Alexa, Siri and their ilk often make mistakes, and frustrated users can find themselves speaking sharply to a piece of plastic on their kitchen counter or their bedside table.

What’s the Magic Word?

Parents try to instil the habit of politeness in their children from an early age, and the people who make these programs are taking notice. Amazon has even created a ‘magic word’ function, which gives positive feedback to users (especially children) who say please and thank you when issuing a command. Still, the question is worth asking: do we need to mind our manners around our digital assistants, before we lose the habit of being polite to each other?

Those who laugh at the idea of saying ‘please’ and ‘thank you’ to a digital assistant tend to argue that we don’t give other machines in our lives the same respect: “Do you say please when you withdraw money from a cash machine,” they ask, “or thank a kettle for boiling your water?” But this argument ignores the fact that you don’t need to speak to a kettle or a cash machine to get them to work.

Digital assistants, on the other hand, rely on dialogue in order to function, which explains why we treat them as what sociologist Clifford Nass called ‘social actors’. Because they understand human language, perform simple human tasks and even respond to us, we tend to apply social expectations to our digital assistants. That’s why we get angry at them when they don’t work - we assume they’re just as emotionally intelligent as we are.

Learning virtual gender biases

However, there’s another issue to consider when it comes to the ways in which we treat our digital assistants: it could impact the way we treat our fellow human beings - especially women. The four main assistants on the market (Siri, Alexa, the Google Assistant and Microsoft’s Cortana) have female voices by default, which their makers justify by pointing to academic research showing people prefer getting help from a female voice. 

But a report by UNESCO suggests that using female voices as the default can play into unconscious gender biases, or reinforce a stereotype that women are more servile, and have “endless reserves of emotional understanding and patience, while lacking emotional needs of their own.”

It doesn’t help that these assistants are often programmed to politely brush off any abuse they get from their owners. The title of the report, “I’d Blush If I Could”, is an actual response that Siri gives if a user calls it a particularly nasty word. Artificial intelligence still lacks that human spark, but if children growing up with these digital assistants get in the habit of treating them like objects, they could end up treating their fellow humans in the same way.

Siri and Alexa don’t really care whether you say ‘please’ and ‘thank you’ to them - at least not yet - but there is inherent value in treating them with respect. As more and more of us invite digital assistants into our homes, we can use them to teach children that manners maketh man, and that good citizenship (digital or otherwise) is built on kindness and empathy.

And if the AI revolution ever does happen, you’ll be glad you got in their good books.
