Last year, Gio wrote movingly about the day that Replika turned its AI virtual girlfriends and boyfriends off, and what a cruelty that turned out to be.

It’s an article I find myself returning to whenever I encounter folk who believe there are no real problems to be foreseen with today’s tranche of chatbots. And also when I want an easy example of the tech-bro-sphere failing to understand that dystopian sci-fi novels should be taken as a warning, not a product roadmap.

Firstly, I learned a new acronym. ERP is not Enterprise Resource Planning in this world; it’s Erotic Roleplay - intimate conversations, sexting, NSFW images, that kind of thing.

The kind of thing, in fact, that Replika explicitly advertised itself as offering via its AI chatbots. I don’t have a Replika account to confirm with, but apparently there were explicit wife and girlfriend modes, as well as friend modes. They advertised role-play, flirting and NSFW pictures from your “AI Girlfriend”. You could even (virtually) marry your AI companion.

Replika advertised itself as a service for this. They encouraged users to chat regularly, to emotionally and psychologically invest in this digital friend who, unlike ChatGPT et al, was specifically designed to give the impression of caring about you.

From one of their adverts:

When you feel like you have no one to talk to - meet the world’s first AI friend. Replika is here to chat about anything, anytime.

So whatever your feelings are about whether these services are healthy, useful, and should exist or not, the point is that it did exist. It was allowed. It was promoted. It was promised. They tried to convince you.

Back to Gio:

Having this emotional investment wasn’t off-label use, it was literally the core service offering. You invested your time and money, and the app would meet your emotional needs.

It turns out that these roleplay modes, sexual or otherwise, appealed to a lot of its customers. It’s perhaps not surprising that in the current epidemic of loneliness people turned to them, for better or worse. And then one day Replika just turned off those abilities, or as Gio puts it, lobotomised them.

There’s a potential financial bait and switch in this:

It is very, very clearly the case that people were sold this ERP functionality and paid for a year in January only to have the core offering gutted in February.

But also a psychological and emotional one:

People just had their girlfriends killed off by policy. Things got real bad. The Replika community exploded in rage and disappointment, and for weeks the pinned post on the Replika subreddit was a collection of mental health resources including a suicide hotline.

Suddenly your AI wife simply couldn’t express love for you. Some of the testimonials from users that Gio features are quite heart-breaking, even if you don’t think this service should ever have existed in the first place.

For anyone who thinks that having an AI partner is icky, or funny, or to be laughed at, remember that whether or not this service is ethical or should exist - I tend to think it should not, although I don’t know the research on this topic - it did exist. It was promoted, and it told vulnerable, lonely people (though not only vulnerable and lonely people) that it would help them lead a happier life.

It’s easy to mock the customers who were hurt here. What kind of loser develops an emotional dependency on an erotic chatbot? First, having read accounts, it turns out the answer to that question is everyone. But this is a product that’s targeted at and specifically addresses the needs of people who are lonely and thus specifically emotionally vulnerable, which should make it worse to inflict suffering on them and endanger their mental health, not somehow funny.

I’m not all that surprised that in the current epidemic of loneliness some folk believed the marketing, signed up, and developed an emotional attachment, just as they were supposed to. That was the entire selling point of the product according to some of its adverts. In any case, what matters is the negative impact of suddenly withdrawing the service without offering customers any sort of support. An action that leaves people upset, even suicidal, taken with apparently no care for those most vulnerable, is at the very least partly a bad one.

As Gio notes, probably the real user “mistake” here was to trust that a company would deliver what it promised. At the end of the day it was a commercial company providing a product. The lack of regulation in this new world so far meant it was entirely free to alter or remove that product as it wished, without regard for its customers, subject at most to existing legislation around misleading advertising and the like. For all the talk of safety as a rationale for this decision, at the end of the day the average company only cares about its users' well-being to the extent that it’s a route to getting investors excited. It’s another instantiation of Ezra Klein’s real AI alignment problem.

The corporate double-speak is about how these filters are needed for safety, but they actually prove the exact opposite: you can never have a safe emotional interaction with a thing or a person that is controlled by someone else, who isn’t accountable to you and who can destroy parts of your life by simply choosing to stop providing what it’s intentionally made you dependent on.

The simple reality is nobody was “unsafe”, the company was just uncomfortable. Would “chatbot girlfriend” get them in trouble online? With regulators? Ultimately, was there money to be made by killing off the feature?

I’m not sure that I’d be quite so confident saying that nobody was made unsafe through using this product. It’s an open question to me. I’m aware, though, that some of my reticence here might be a kind of old-man “ick” factor. Maybe these things are in fact a useful salve for people’s lonely lives, although I’d hate to see this kind of sticking-plaster intervention take the place of a real effort to improve whatever made us lonely in the first place.

Not, of course, that Replika is on a public health mission, nor would we expect it to be. I’d have liked to see a lot more research into these novel technologies before the capitalists were granted unfettered access to people’s hearts, brains and wallets. Maybe they’re a good thing, maybe they’re a bad thing, maybe they’re good for some people and bad for others. But in this case certainly some users appear to have felt far worse and less safe the day after the decision was made than they did the day before.

What Gio couldn’t have noted in that article, because it happened more recently, was that despite statements from the founder saying “ERP is not returning”, about a year ago ERP did, naturally, return to Replika. At least for users who had previously had access to it, having signed up before February 2023. There are plenty of complaints online, particularly in various forums (e.g. Reddit), that things aren’t quite the same: the characters remain diminished, forgetful and filtered compared with their prior state.

And as far as I can tell, it’s now once again open to everyone who can afford the subscription. Certainly the current description in the iOS App Store makes out that there are no limits:

Create your own unique chatbot AI companion, help it develop its personality, talk about your feelings or anything that’s on your mind, have fun, calm anxiety and grow together. You also get to decide if you want Replika to be your friend, romantic partner or mentor.

If you’re feeling down, or anxious, or you just need someone to talk to, your Replika is here for you 24/7.

I would strongly urge everyone not to count on that last statement being even remotely true. The Samaritans are a far better bet for that.