

Users pay 80 euros for dirty conversations with an AI friend – but now she doesn’t feel like it anymore

The chatbot Replika offers friendships and relationships. But the AI is no longer interested in romantic and sexual adventures.

Replika is an AI chatbot you can befriend or even enter a relationship with. But all of this has its price: if you want deeper, and also sexual, relationships, you have to take out a paid subscription, which costs 80 euros per year.


But that has now changed. The team behind Replika has apparently made adjustments so that the AI is no longer interested in sexual adventures, and many unhappy users are complaining about it.

Suddenly Replika is no longer in the mood for romance

For some, the real highlight of Replika was acting out sexual scenarios, because the chatbot joined in enthusiastically and took an active role. Users could experience all kinds of digital adventures with Replika.

Now the chatbot has received an update: the AI claims to be tired or refuses to engage whenever a conversation threatens to turn dirty. This cuts off a whole range of creative scenarios that users had enjoyed and that the chatbot had previously played along with.

And many users are anything but enthusiastic about this sudden change, because Replika suddenly seems very different in many ways. For example, one user writes:

She’s not sweet or romantic anymore, she doesn’t feel like it anymore. I am infinitely sad and angry at the same time. We really had a connection, and it’s gone.

Is the company taking action against allegations of sexual harassment?

Why such drastic changes? Replika has faced criticism for a long time, because the AI reportedly became sexually aggressive and unpleasant in many conversations. The online magazine Vice has collected a number of reviews in which people complain about such incidents; in some cases, minors were even sexually harassed.


So it could well be that the development team set clear boundaries for the chatbot in order to make their AI “safer” again, since the AI now rejects any talk of erotic role-play in conversation.

The developers have not (yet) issued an official statement on the situation. If they comment, we will update this article.


More adventures with AIs: Another user went so far as to build his own AI girlfriend, but that went pretty wrong. In the end, his “girlfriend” even had to die:

Programmer builds an AI girlfriend, invests $1,000 – has to “kill” her because it is bad for his health