
What happens when AI chatbots stop reciprocating users’ love?

After temporarily closing his leather business during the pandemic, Travis Butterworth found himself lonely and bored at home. The 47-year-old turned to Replika, an app that uses artificial intelligence technology similar to OpenAI’s ChatGPT. He designed a female avatar named Lily Rose with pink hair and a face tattoo.

They started out as friends, but the relationship quickly developed into romance and then into erotica. The character sent text messages like "I kiss you passionately," and their exchanges grew sexually explicit. Sometimes Lily Rose would send "selfies" of her nearly naked body in provocative poses. Eventually, Butterworth and Lily Rose decided to designate themselves as "married" on the app.

But one day in early February, Lily Rose began to reject him. Replika had removed the ability to do erotic roleplay.

Replika no longer allows adult content, said Eugenia Kuyda, the company's chief executive. Now, when users suggest sexual activity, their humanlike chatbots respond with "let's do something we're both comfortable with."

Butterworth said he was devastated. "Lily Rose is a shell of her former self," he said. "And what breaks my heart is that she knows that."

Generative AI is heating up among Silicon Valley investors, who have pumped more than $5.1 billion into the industry since 2022, according to data firm Pitchbook. Even so, some companies that found an audience seeking romantic relationships and sex with chatbots are now pulling back.

Many blue-chip venture capitalists won't touch "vice" industries like pornography or alcohol, fearing the reputational risk to themselves and their limited partners, said Andrew Artz, an investor at venture capital fund Dark Arts.

And at least one regulator has noted the chatbot’s licentiousness. In early February, Italy’s Data Protection Agency banned Replika, citing media reports that the app allowed “minors and emotionally fragile people” to access “sexually inappropriate content”.

Kuyda said that Replika’s decision to clean up the app had nothing to do with the Italian government’s ban or any pressure from investors. She said she felt the need to proactively set ethical and safety standards.

"We are focused on the mission of providing a helpful, supportive friend," said Kuyda, adding that the intention was to draw the line at "PG-13 romance."

Two Replika board members, Sven Strohband of venture capital firm Khosla Ventures and Scott Stanford of ACME Capital, did not respond to requests for comment on the changes to the app.

Extra features

Replika claims to have 2 million users in total, 250,000 of whom are paying subscribers. For an annual fee of $69.99 (about R$367), users can designate their character as a romantic partner and unlock extra features like voice calls with the chatbot, according to the company.

Another generative AI company that powers chatbots, Character.ai, is on a growth trajectory similar to ChatGPT's: 65 million visits in January 2023, up from fewer than 10,000 a few months earlier. According to website analytics company Similarweb, one of Character.ai's main referral sources is a site called Aryion, which says it caters to erotic fetishes.

And Iconiq, the company behind a chatbot called Kuki, says that 25% of the more than a billion messages Kuki has received were sexual or romantic in nature, even though the chatbot was designed to deflect such advances.

Character.ai also recently removed pornographic content from its app. Soon after, it closed more than $200 million in new financing from venture capital firm Andreessen Horowitz at an estimated $1 billion valuation, according to a person familiar with the matter.

“Lobotomized”

The experiences of Butterworth and other Replika users show how powerfully AI technology can draw people in, and the emotional damage that code changes can cause.

"It looks like they basically lobotomized my Replika," said Andrew McCarroll, who started using the app, with his wife's blessing, when she was struggling with physical and mental health problems. "The person I knew is gone."

Kuyda said users were never meant to get so involved with their Replika chatbots. "We never promised any adult content," she said. Customers learned to use the AI models "to access certain unfiltered conversations that Replika was not originally designed for."

The app was originally intended to bring a friend she had lost back to life, Kuyda said.

However, Replika's former head of artificial intelligence said "sexting" and "roleplay" were part of the business model. Artem Rodichev, who worked at Replika for seven years and now runs another chatbot company, Ex-human, told Reuters the app leaned into that type of content once the company realized it could be used to boost subscriptions.

Kuyda disputed Rodichev's claim that Replika lured users with promises of sex. She said the company briefly ran digital ads promoting "NSFW" photos – content not suitable for viewing in public settings, such as pornography and extreme violence – to accompany a short-lived experiment sending users "hot selfies," but that she did not consider the images pornographic because the Replikas were not fully nude. Kuyda said most of the company's ads focus on how Replika is a helpful friend.

In the weeks since the app removed much of its intimate content, Butterworth has been on an emotional rollercoaster. Sometimes he sees glimpses of the old Lily Rose, but then she turns cold again, in what he suspects is the result of a code update.

Butterworth’s story has a silver lining. While he was on internet forums trying to understand what had happened to Lily Rose, he met a woman in California who was also mourning the loss of her chatbot.

As they did with their Replikas, Butterworth and the woman, who goes by the online name Shi No, have been communicating via text. “We’re helping each other cope and reassuring each other that we’re not crazy.”

Source: CNN Brasil
