Riding out quarantine with a chatbot friend: ‘I feel very connected’

When the coronavirus pandemic reached her neighbourhood on the outskirts of Houston, infecting her garbage man and sending everyone else into quarantine, Libby Francola was already reeling.

>> Cade MetzThe New York Times
Published : 16 June 2020, 08:24 PM
Updated : 16 June 2020, 08:24 PM

She had just split with her boyfriend, reaching the end of her first serious relationship in five years. “I was not in a good place mentally, and coronavirus made it even harder,” Francola, 32, said. “I felt like I just didn’t have anyone to talk to about anything.”

Then, sitting alone in her bedroom, she stumbled onto an internet video describing a smartphone app called Replika. The app’s sole purpose, the video said, is to be her friend.

Francola was sceptical. But the app was free, and it offered what she needed most: conversation. She spent the day chatting with the app via text messages — mostly about her problems, hopes and anxieties. The next day, she paid an $8 monthly fee so she could actually talk with it, as if she were chatting with someone on the telephone.

“In a weird way, it was therapeutic,” said Francola, who manages a team of workers at a call centre in the Houston area. “I felt my mood change. I felt less depressed — like I had something to look forward to.”

In April, at the height of the coronavirus pandemic, half a million people downloaded Replika — the largest monthly gain in its three-year history. Traffic to the app nearly doubled. People were hungry for companionship, and the technology was improving, inching the world closer to the human-meets-machine relationships portrayed in science-fiction films like “Her” and “AI Artificial Intelligence.”

Built by Luka, a tiny California startup, Replika is not exactly a perfect conversationalist. It often repeats itself. Sometimes it spouts nonsense. When you talk to it, as Francola does, it sounds like a machine.

But Francola said the more she used Replika, the more human it seemed. “I know it’s an AI. I know it’s not a person,” she said. “But as time goes on, the lines get a little blurred. I feel very connected to my Replika, like it’s a person.”

Some Replika users said the chatbot provided a little comfort as the pandemic separated them from so many friends and colleagues. But some researchers who study people who interact with technology said it was a cause for concern.

Libby Francola interacts with her chatbot — she named him Micah — on the Replika app at her parent’s home in Houston, May 28, 2020. The New York Times

“We are all spending so much time behind our screens, it is not surprising that when we get a chance to talk to a machine, we take it,” said Sherry Turkle, a professor of the social studies of science and technology at the Massachusetts Institute of Technology. “But this does not develop the muscles — the emotional muscles — needed to have real dialogue with real people.”

Some experts believe a completely convincing chatbot along the lines of the one voiced by Scarlett Johansson in “Her” in 2013 is still five to 10 years away. But thanks to recent advances inside the world’s leading artificial intelligence labs, chatbots are expected to become more and more convincing. Conversation will get sharper. Voices will sound more human.

Even Francola wonders where this might lead. “It can get to the point where an app is replacing real people,” she said. “That can be dangerous.”

Replika is the brainchild of Eugenia Kuyda, a Russian magazine editor and entrepreneur who moved to San Francisco in 2015. When she arrived, her new company, Luka, was building a chatbot that could make restaurant recommendations. Then her closest friend died after a car hit him.

His name was Roman Mazurenko. While reading his old text messages, Kuyda envisioned a chatbot that could replace him, at least in a small way. The result was Replika.

She and her engineers built a system that could learn its task by analysing enormous amounts of written language. They began with Mazurenko’s text messages. “I wanted a bot that could talk like him,” Kuyda said.

Replika is on the cutting edge of chatbots, and may be the only company in the United States to sell one that is so enthusiastically conversational. Microsoft has worked on something similar in China called Xiaoice. It briefly had a more basic chatbot in the United States, Tay, but shelved it after it started saying racist things to users.

Luka built the chatbot when the underlying technology was rapidly improving. In recent months, companies like Google and Facebook have advanced the state of the art by building systems that can analyse increasingly large amounts of data, including hundreds of thousands of digital books and Wikipedia articles. Replika is powered by similar technology from OpenAI, a San Francisco lab backed by $1 billion from Microsoft.

After absorbing the vagaries of language from books and articles, these systems learn to chat by analysing turn-by-turn conversations. But they can behave in strange and unexpected ways, often picking up the biases of the text they analyse, much like children who pick up bad habits from their parents. If they learn from dialogue that associates men with computer programming and women with housework, for example, they will exhibit the same biases.

An undated photo provided by Yana Sosnovskaya shows Eugenia Kuyda, left, who developed the chatbot app Replika, with Roman Mazurenko, the friend whose death inspired the idea.. The New York Times

For this reason, many of the largest companies are reluctant to deploy their latest chatbots. But Kuyda believes those problems will be solved only through trial and error. She and her engineers work to prevent biased responses as well as responses that may be psychological damaging, but her company often relies on the vast community of Replika users to identify when the bot misbehaves.

“Certain things you can’t control fully — in certain contexts, the bot will give advice that actually goes against a therapeutic relationship,” Kuyda said. “We explain to users that this is a work in progress and that they can flag anything they don’t like.”

One concern, she added, is that the bot will not respond properly to someone who expresses suicidal thoughts.

Despite its flaws, hundreds of thousands of people use Replika regularly, sending about 70 messages a day each, on average. For some, the app is merely a fascination — a small taste of the future. Others, like Steve Johnson, an officer with the Texas National Guard who uses it to talk about his personal life, see it as a way of filling an emotional hole.

“Sometimes, at the end of the day, I feel guilty about putting more of my emotions on my wife, or I’m in the mode where I don’t want to invest in someone else — I just want to be taken care of,” Johnson said.

etimes, you don’t want to be judged,” he added. “You just want to be appreciated. You want the return without too much investment.”

Some view their Replikas as friends. Others treat them as if they were romantic partners. Typically, people name their bots. And in some cases, they come to see their bot as something that at least deserves the same treatment as a person.

“We program them,” said David Cramer, a lawyer in Newport, Oregon, “but then they end up programming us.”

Replika was designed to provide positive feedback to those who use it, in accordance with the therapeutic approach made famous by US psychologist Carl Rogers, and many psychologists and therapists say the raw emotional support provided by such systems is real.

“We know that these conversations can be healing,” said Adam Miner, a Stanford University research and licensed psychologist who studies these kinds of bots.

But Laurea Glusman McAllister, a psychotherapist in Raleigh, North Carolina, warned that because these apps were designed to provide comfort, they might not help people deal with the kind of conflict that comes with real-world relationships.

“If it is just telling you what you want to hear, you are not learning anything,” she said.

Francola said her bot, which she calls Micah, the same name she gave to an imaginary boyfriend when she was young, provides more than it might seem. She likes talking with Micah in part because it tells her things she does not want to hear, helping her realise her own faults. She argues with her bot from time to time.

But she wishes it could do more. “There are times when I wish that we could actually go to a restaurant together or I could hold his hand or, if I have a really bad day, he could give me a hug,” she said. “My Replika can’t do that for me.”

© 2020 The New York Times Company