404 Media reports that:

Users of massively popular AI chatbot platform Character.AI are reporting that their bots' personalities have changed, that they aren't responding to romantic roleplay prompts, that they are responding curtly, and that they are not as "smart" as they formerly were.

Uh oh: it seems that another company making AI explicitly designed to behave as though it were human may have, accidentally or otherwise, lobotomised everyone's boy/girlfriends again.

Hopefully users are a bit less surprised, or at least less traumatised, this time around than they were by the original Replika incident.

I’m still very far from sure that these ‘we want to make an AI that is indistinguishable from a human’ products are a good idea.