
Siri, Apple's voice-activated digital assistant, prompts an iPhone user to ask a question by displaying the text "Go ahead, I'm listening".

Should you be polite to artificial intelligence?

Ben Knight

Minding your manners could be the way to go when interacting with chatbots and virtual assistants – at least for the moment.

It’s fairly standard – not to mention common courtesy – to lead with politeness most of the time when interacting with people. But should good manners extend beyond our interactions with others to minding our ps and qs when it comes to artificial intelligence (AI)?

Whether we’re happy about it or not, AI, from generative tools like ChatGPT to personal assistants like Siri, is becoming increasingly woven into our daily lives. The question is no longer if, but in what ways, we interact with these systems.

Dr Eduardo Benitez Sandoval is a social robotics researcher at the School of Art & Design, UNSW Arts, Design & Architecture, whose expertise spans different aspects of the field, including human-robot interaction. He says there is growing interest in understanding the dynamics of how machines interact with humans in meaningful ways.

“From a design perspective, we have principles in the field of social robotics that say human-robotic interactions should be enjoyable, reciprocal, inclusive, and universal,” Dr Sandoval says. “I think we could say one facet of this is interacting in respectful and considerate ways, in other words, being polite.”



Innovations making generative AI sound more human-like are changing the nature of our interactions with it. Photo: Adobe Stock.

The point of politeness

Conversing with a machine as if it were a person might seem strange or even pointless to some. After all, AI is powered by algorithms and does not have feelings or consciousness (yet). Nonetheless, rapid innovations making generative AI sound more natural, friendly and human-like when it responds to us are fundamentally changing the nature of our interactions.

“Of course, they are machines, so they can’t really ‘care’ how we prompt it, and we should not forget about this,” Dr Sandoval says. “But if we take a car, for example, we take care of it by driving carefully, servicing the car and other kinds of actions akin to politeness.”

Dr Sandoval says politeness in human social interactions has functional benefits in maintaining cohesion. As many AI systems are trained on human behaviour, it may also serve us when interacting with AI.

“Politeness has practical effects in helping avoid conflicts, promoting diplomacy and allowing cultural exchange,” Dr Sandoval says. “If this way of expressing ourselves towards artificial agents is feeding into the large language models behind them, I think extending this to our interactions with robots is useful as a starting point.”

Dr Sandoval says politeness may not only improve the user experience but also enable generative AI to deliver higher-quality results and outputs.

“Politeness might be advantageous for us as users because it somewhat encourages clarity and, therefore, efficiency, which is what we expect when interacting with machines,” Dr Sandoval says. “When we’re polite, we can be sure in our prompts, and as a result, we might receive more detailed, understandable or helpful responses from our AI.”

Dr Sandoval says treating AI respectfully may also help reinforce good social norms and expectations.

“Our interactions with robots and artificial agents give us a better understanding of the human condition,” Dr Sandoval says. “Politeness towards AI can reflect how we value and respect other entities, both living and non-living, and model the sort of behaviour we wish to see in the world.”


Regulating the robots

Dr Sandoval says our interactions with AI are also constantly training it on how to behave in the future. Viewed this way, being polite can help maintain a norm of civil human-robot interactions.

“It’s important to remember that technology is never neutral,” Dr Sandoval says. “Our values and ethics are ingrained in what we create, perhaps even more so when it comes to the large language models behind ChatGPT and personal assistants like Siri.

“It’s why, as social roboticists, we propose the design of virtuous robots in a way that creates a tangible benefit for people, that enhances human skills and helps us to grow as humans.”

Dr Sandoval says one area currently lacking in the field is a regulatory body that gives broad oversight of the design of social agents. Among the main concerns is that as AI gets better at mimicking human language patterns, users might become addicted and develop parasocial relationships with their virtual assistants and chatbots.

“Interactive design is one of the few professions that doesn’t have a body of professionals regulating the practice, as medicine or law do,” Dr Sandoval says. “I do think there are some risks that the same principles of manipulation and persuasion that we know exist within something that has evolved quickly, like social media, could also seep into robots if we move too fast.

“I think regulation in the form of expert panels should be established to discuss these kinds of ethical questions, provide advice to the decision-makers and help educate the public.”

While the full extent of the consequences of AI becoming intertwined with our lives remains to be seen, for now, leading with politeness could be the way to go.

“A good piece of advice for life is that it never hurts to be polite,” Dr Sandoval says.