Amelia is a technology developed by IPsoft to automate some customer service, IT and business processes. She looks very human.
Some people are famous only among fans of a particular sport, within a specific age group, or in their hometown. Lauren Hayes, a 27-year-old model and entrepreneur, is famous at the automation software company IPsoft.
At a recent conference the company hosted in New York, suited C-level executives stopped her in the hallway to take photos. An executive at one of the largest insurance companies in the United States told her 65,000 of his employees loved her. And during his keynote presentation, the CEO of IPsoft, Chetan Dube, called Hayes on stage to guest star in a faux game show. Her opponent was Amelia, who is also the reason for her contextual fame.
Amelia is a technology developed by IPsoft to automate some customer service, IT and business processes. She supports employees on the IT service desk at a global telecom provider, responds to 3,000 customer queries per week at a U.S. insurance company, and has helped answer inquiries from external mortgage brokers at a global bank.
Hayes is the human model upon which IPsoft based Amelia’s virtual avatar, which pops up in chat windows of customer service agents, at customer service kiosks, and, as in this game show moment, during demonstrations of IPsoft technology.
On stage, Amelia looks like an animated 3-D cartoon character. Her blonde hair and blue eyes match Hayes’ blonde hair and blue eyes, and her smile looks like Hayes’ smile. When Amelia sits idle for a while and gets bored, she sometimes looks up at the ceiling and moves her eyes from one corner to another.
“It makes me think, is this what I look like when I do it?” Hayes says.
From the audience, it looks as though the human Hayes is facing off with a digital version of herself. It could be a scene in a science fiction novel, and it inspires both fears and hopes for the ways in which technology can replace humans.
All of this has turned what Hayes at first thought was just another modeling gig into a very bizarre job.
Hayes says she didn’t realize she had become the face of an artificial intelligence until about a year after the first version of Amelia launched in 2014. She knew something was different about the modeling job when she showed up to the photo shoot and found what she calls a “death star,” a sphere-shaped structure that supports many cameras.
“At that moment, I was like, this is not like anything I’ve ever done before,” Hayes says. “This is not a print job for the Gap.” Until she put ‘Amelia’ in a search window long after the project had wrapped, however, she hadn’t imagined a fully animated version of her likeness—or that it would be programmed to converse with humans.
“It was really creepy,” she says. “I didn’t imagine it would be so realistic. I didn’t realize it would talk or have motion.”
That was Amelia 1.0. Later versions of Amelia will be even more realistic. For Amelia 3.0, which hasn’t launched yet, IPsoft flew Hayes to Serbia, to a studio that specializes in making digital characters for movies and video games. This time, in addition to the “death star” 3-D body scanning, the studio cataloged Hayes’ movements. She spent a day doing, she says, “basically anything anyone could ever ask Amelia to do.”
When prompted, she pretended she had just seen Brad Pitt, for instance, and that she had just seen her best friend. Dots on her face helped cameras track her specific expressions, which will be used to help animate Amelia’s face in real time. Outside of the death star, a movement suit with motion-capture sensors mapped her mannerisms and actions.
What is the point of making Amelia’s avatar so realistic? Or creating a human persona for her at all?
“I realized that having a circle as an avatar doesn’t help me feel connected to the system that I’m working with,” says IPsoft’s director of experience design, Christopher Reardon, who previously worked on the branding for IBM’s Watson (which has a default avatar that is, you guessed it, a complicated circle). “When you talk to somebody, there is all sorts of non-verbal communication. The avatar itself helps with empathy. If the end user feels like they’re being heard and understood, they’re more likely to engage further and in more length. And that allows Amelia to grasp the intent of what the user is trying to say.”
The avatar itself is programmed to react to human conversations with appropriate expressions and actions (so Amelia doesn’t smile, for instance, when an insurance client explains that they’ve just been diagnosed with a terrible disease).
IPsoft isn’t the only company that goes to great lengths to make its automation technology seem more relatable, whether that involves coming up with a backstory, mimicking tone and emotion in speech, or making its avatar actually look human.
Reardon says the humanness of Amelia is partly intended to help workers feel more comfortable interacting with her.
“They won’t feel, do I need an instruction manual to work with an AI? We all instinctively know how to communicate with each other well.”
Most of Amelia’s appearances do not come with the full animated avatar. Only in special implementations, such as at a customer service kiosk, does Amelia’s full life-like avatar make an appearance.
In other places, her still image serves as the avatar for a Slack chat or an interface that looks like a text chat window, but companies that purchase IPsoft technology can also change the avatar. IPsoft is working with some clients to create customized versions of Amelia’s avatar, for instance. The avatar based on Hayes is IPsoft’s branded version of the technology, and it shapes the way companies think about and introduce it.
That avatar is going to get more realistic. Edwin van Bommel, IPsoft’s chief cognitive officer, says the company is careful to avoid the “uncanny valley,” the point at which Amelia is nearly lifelike, but still just slightly off, and therefore creepy. But it’s a moving target, he says: culture is getting accustomed to the idea of human-looking artificial intelligence.
In version 3 of Amelia, her face will move in more ways and be so detailed that you can see her pores. Like Hayes’ face and all human faces, it will be slightly asymmetrical.
“If I filmed Lauren and Amelia at the same time, and had them walk across the screen, you wouldn’t be able to tell the difference between the two of them,” Reardon says.
Because most phones and computers can’t render an image that detailed, especially in real time, that won’t be the version of Amelia users see at the 50 global companies where she’s deployed.
Most likely, it will be used in demonstrations like the game show between Amelia and Hayes.
On stage, Hayes easily responds to quiz questions faster than Amelia, and with more natural, human language. Even when their images look exactly the same, there will still be ways to tell Hayes and Amelia apart, at least for now.