Is Siri Lying to You? Knowing When a Bot Sounds Trustworthy Is the Next Step in Digital Security

Trust is central to design. Whenever we arrive at a new website or consider a product, we need to feel we can trust the brand before we interact with it. That trust determines whether we sign up, place an order, refer a friend, or come back for a second spin, and the judgment is made in a matter of unconscious seconds.

But that’s no longer the only place we look for trust. The graphical user interfaces (GUIs) we use to interact with websites are gradually being complemented, or replaced entirely, by voice user interfaces (VUIs) such as personal-assistant bots. It’s the difference between Amazon’s website, a GUI, and Amazon’s voice helper, Alexa. We therefore need to learn to recognize which voices, not just which graphics, we can trust.

We have good reason to be suspicious. Consumers have learned to intuit that it’s a bad idea to sign up for a website that imitates Facebook’s look and feel, use a search engine that copies Google’s logo and color scheme, or buy from an online marketplace that lacks a security symbol. We’ve learned which visual security cues to look for. But what about audible cues?

Scanning for audible security cues is a skill we’ll have to acquire in the near future, and as VUIs become an increasingly common interface in our daily lives, the sooner, the better. Take virtual personal assistants such as Amazon’s Alexa, Apple’s Siri, and Microsoft’s Cortana. In the future, VUIs will be used in situations where a faceless GUI could feel impersonal: you might confirm a large bank transfer with a friendly VUI, or have your blood-test results read to you by a health app. But before we interact with these products and reveal sensitive information to them, they first need to earn our trust.

Research shows that tone of voice matters more than words when it comes to making first impressions: varying pitch and volume in certain ways can actually convey more trust. In other words, it’s not what you say, it’s how you say it. These findings can be applied to designing bot voices: create a natural-sounding, humanized voice rather than a robotic one; develop a unique tone that reflects the equally distinctive visual look and feel of a brand; and make sure your VUI can reply like a human. “I don’t understand your input” isn’t as effective as “I didn’t catch that, say again?”
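As a rough illustration of that last point, a designer might keep a small set of conversational fallback lines and pick one at random, instead of hard-coding a single robotic error message. This is only a sketch under assumed names: the function and the phrases are hypothetical, not drawn from any particular assistant.

```python
import random

# Hypothetical fallback phrases: conversational alternatives to a
# robotic "I don't understand your input" message.
FALLBACK_PHRASES = [
    "Sorry, I didn't catch that. Could you say it again?",
    "Hmm, I'm not sure I followed. Can you rephrase that?",
    "I missed that one. Mind repeating it?",
]


def fallback_reply() -> str:
    """Return a randomly chosen, human-sounding fallback line."""
    return random.choice(FALLBACK_PHRASES)


if __name__ == "__main__":
    # The bot falls back to a conversational prompt when it can't
    # match the user's utterance to a known intent.
    print(fallback_reply())
```

Rotating through a few phrasings also keeps the bot from repeating itself verbatim, which is one of the quickest ways to remind a user they’re talking to a machine.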

A new field of “voice designers” will therefore play a key role in creating trustworthy VUIs. Building trust through GUI design already has established best practices: colors, images, and language all guide our decision-making. For VUI design, it’s an emerging world of tone, speed, accent, utterances, and more.

For example, designers need to make sure their bots pause to “breathe” between sentences, so users can process what they’ve heard, think about their next action, and feel as though they’re speaking to a human. Without these small silences, it seems like the other person (or bot) isn’t actually listening to you. Programmers could even add affirmative “mmms” to listening responses, much like the audible head nod that signals we’re engaged in real life.
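In practice, pauses like these are often expressed in Speech Synthesis Markup Language (SSML), which most major voice platforms accept. The sketch below is an illustration rather than any platform’s official recipe: it inserts short breaks between sentences before the markup would be handed to a text-to-speech engine, and the 400-millisecond break length is an arbitrary choice for the example.

```python
# Sketch: wrap a bot reply in SSML with short pauses between sentences,
# so the synthesized voice "breathes" the way a human speaker would.
# The 400 ms break is an arbitrary value chosen for illustration.

def to_ssml(sentences: list[str], pause_ms: int = 400) -> str:
    """Join sentences into SSML, inserting a <break> after each one."""
    breath = f'<break time="{pause_ms}ms"/>'
    body = breath.join(f"<s>{s}</s>" for s in sentences)
    return f"<speak>{body}</speak>"


if __name__ == "__main__":
    reply = [
        "Your transfer of 2,000 dollars is ready to send.",
        "Should I go ahead and confirm it?",
    ]
    # This markup would be passed to a text-to-speech service that
    # understands SSML (for example, a cloud TTS API).
    print(to_ssml(reply))
```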

But with trust comes responsibility. When we really trust someone, they can convince us to do things we might not normally do. Tech giants therefore face a great dilemma when building super-smart VUIs: how to serve users with what they need while also maintaining ethical boundaries around privacy and safety. Today, websites share browsing histories, personal data, and other private information to make an extra buck. Will they one day sell your conversations with your personal assistant as well, or manipulate you into spending money on their platforms?

These are questions consumers, designers, and tech companies must consider now, as VUIs will soon become more than just an added feature of existing platforms. Voice activation will change the way we interact with our computers, the way we shop, and the way we engage with those around us, so we need to tread carefully. It takes a lot of work to gain our trust, and it’s all too easy to lose.